Title | MIAMI: A Framework for Application Performance Diagnosis |
Publication Type | Conference Paper |
Year of Publication | 2014 |
Authors | Marin, G., J. Dongarra, and D. Terpstra |
Conference Name | IEEE International Symposium on Performance Analysis of Systems and Software (ISPASS 2014) |
Date Published | 2014-03 |
Publisher | IEEE |
Conference Location | Monterey, CA |
ISBN Number | 978-1-4799-3604-5 |
Abstract | A typical application tuning cycle repeats the following three steps in a loop: performance measurement, analysis of results, and code refactoring. While performance measurement is well covered by existing tools, analysis of results to understand the main sources of inefficiency and to identify opportunities for optimization is generally left to the user. Today’s state-of-the-art performance analysis tools use instrumentation or hardware counter sampling to measure the performance of interactions between code and the target architecture during execution. Such measurements are useful to identify hotspots in applications, places where execution time is spent or where cache misses are incurred. However, explanatory understanding of tuning opportunities requires a more detailed, mechanistic modeling approach. This paper presents MIAMI (Machine Independent Application Models for performance Insight), a set of tools for automatic performance diagnosis. MIAMI uses application characterization and models of target architectures to reason about an application’s performance. MIAMI uses a modeling approach based on first-order principles to identify performance bottlenecks, pinpoint optimization opportunities, and compute bounds on the potential for improvement. |
DOI | 10.1109/ISPASS.2014.6844480 |
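The abstract above contrasts MIAMI's model-based diagnosis with the hardware counter measurements that existing tools already provide. As a point of reference only (this is not MIAMI itself), the sketch below shows the kind of counter measurement being contrasted, using the PAPI preset-event C API around a hypothetical hotspot loop; it assumes PAPI is installed and that the PAPI_TOT_CYC and PAPI_L2_TCM presets are available on the target machine.

```c
/* Minimal sketch of hardware-counter measurement with PAPI.
 * The kernel and event choices are illustrative assumptions;
 * error checking is omitted for brevity. */
#include <stdio.h>
#include <stdlib.h>
#include <papi.h>

#define N (1 << 20)

static double a[N], b[N];

int main(void) {
    int eventset = PAPI_NULL;
    long long counts[2];

    if (PAPI_library_init(PAPI_VER_CURRENT) != PAPI_VER_CURRENT)
        return EXIT_FAILURE;

    PAPI_create_eventset(&eventset);
    PAPI_add_event(eventset, PAPI_TOT_CYC);  /* total cycles */
    PAPI_add_event(eventset, PAPI_L2_TCM);   /* L2 total cache misses */

    PAPI_start(eventset);
    for (int i = 0; i < N; i++)              /* hypothetical hotspot loop */
        a[i] = 2.0 * b[i] + 1.0;
    PAPI_stop(eventset, counts);

    /* Print a[0] as well so the loop is not optimized away. */
    printf("cycles: %lld  L2 misses: %lld  a[0]=%g\n",
           counts[0], counts[1], a[0]);
    return EXIT_SUCCESS;
}
```

Such raw counts locate where time and misses accrue; the paper's point is that explaining why, and bounding the possible improvement, requires the machine-independent application models and architecture models that MIAMI builds on top of measurements like these.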