Performance Tools Working Group, August 1–2, 2007, Washington, DC. Chairs: Dan Reed, Renaissance Computing Institute (RENCI); Bernd Mohr, Research Centre Juelich

Page 1

Performance Tools Working Group, August 1–2, 2007, Washington, DC

Chairs:

Dan Reed, Renaissance Computing Institute (RENCI)

Bernd Mohr, Research Centre Juelich

Page 2

Thank You

Thanks to the group for valuable insights
– lots of interchange and good ideas

We have tried to capture the key ideas
– any errors are Bernd/Dan’s

2 August 2007

Page 3

Our Charge: Performance Tools

Topics
– analysis, modeling and optimization
– interactive and automatic approaches
– data management and instrumentation
– hardware and OS support
– visualization and presentation
– etc.

Current status

Petascale requirements

Findings

Recommendations
– ordered priority list
– challenge type
  • technical, funding
  • policy, training
– impact
  • high, medium, low
– probability (risk)
  • high, medium, low

Page 4

On Performance Tools …

[Figure: evolution of performance tools, from sequential through terascale to trans‐petascale systems]

Page 5

Ecosystem Roles and Interactions

Less disjoint than one might think
– small community with deep and long history

[Diagram: interactions among academia, laboratories, government, and industry]

Page 6

Performance Measurement: A Status Report

Intelligent use well understood
– instrumentation, measurement and analysis

Instrumentation techniques
– FORTRAN, C, less C++, …
– MPI, user functions/regions, less OpenMP, …

Measurement techniques
– sampling, profiling and tracing

Analysis techniques (weakest of the three)
– too much data, not enough analysis
– tools find symptoms, but not root causes
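The trace-based side of the measurement techniques above can be sketched in a few lines. This is a hypothetical illustration, not any specific tool discussed by the group: a decorator records a timestamped event at every entry and exit of an instrumented user function, which is the essence of tracing "user functions/regions".

```python
import time
from functools import wraps

# Hypothetical sketch of trace-based measurement: log a timestamped
# event at every entry and exit of an instrumented user function.
trace_log = []

def traced(fn):
    @wraps(fn)
    def wrapper(*args, **kwargs):
        trace_log.append(("enter", fn.__name__, time.perf_counter()))
        try:
            return fn(*args, **kwargs)
        finally:
            trace_log.append(("exit", fn.__name__, time.perf_counter()))
    return wrapper

@traced
def solver_step(x):          # stand-in for an application region
    return 2 * x

solver_step(21)
# trace_log now holds one "enter" and one "exit" event for solver_step.
```

A profiler would aggregate these events into per-function totals instead of keeping the full event stream, which is why tracing produces the "too much data" problem noted above.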

Page 7

Performance Measurement: A Status Report

Effective techniques for homogeneous systems
– heterogeneous challenges coming
  • multicore, specialized processors, …

Too much concentration on time as a metric
– need more support for memory analysis, …

Largely left to the user
– true analysis, then optimization/tuning

Page 8

Performance Modeling: A Status Report

Limited breakout group discussion
– not a reflection of lack of importance

Multiple meanings/uses of modeling
– system characterization
– application prediction, …

Opinion
– better than in the measurement community

Page 9

Qualitative Status Assessment

Measurement/analysis: WIP

Modeling: WIP

Optimization: NC

Interactive/manual: WIP

Automatic: WIP/NC

Data management: WIP

Instrumentation: WIP

Hardware and operating system support: WIP

Visualization/presentation: WIP

Legend: WIP = work in progress; NC = no clue; IH = in hand

Page 10

Performance Tool Ecosystems


Page 11

Petascale Requirements

Increased automation
– anomaly detection
– correlation and clustering
– data reduction

Abstraction support
– detail/complexity hiding

Runtime adaptation
– task topologies, …

Heterogeneity
– programming models: explicit and implicit
– hardware

Hierarchy, including sharing
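The "increased automation" items above can be made concrete with a small sketch. The function and data below are invented for illustration, not from the report: given per-rank timings, a simple detector flags ranks whose times deviate from the population mean by more than k standard deviations.

```python
import statistics

# Hypothetical sketch of automated anomaly detection: flag ranks whose
# timing deviates from the mean by more than k standard deviations.
def flag_anomalies(times_by_rank, k=2.0):
    mean = statistics.fmean(times_by_rank)
    stdev = statistics.pstdev(times_by_rank)
    if stdev == 0:
        return []
    return [rank for rank, t in enumerate(times_by_rank)
            if abs(t - mean) > k * stdev]

# Eight ranks; rank 5 is a straggler.
times = [1.02, 0.99, 1.01, 1.00, 0.98, 4.50, 1.03, 0.97]
print(flag_anomalies(times))  # [5]
```

At petascale such a reduction would itself have to run in parallel (the "data reduction" item above), since shipping a million raw timings to one place is already too much data.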

Page 12

Petascale Requirements

Fault tolerance/resilience

Education and training

Multi‐level instrumentation

Memory and I/O analysis

Performability
– hybrid/integrated performance and reliability

Presentation and insight
– scalable visualization

Performance modeling and prediction

Scaling of known methods and techniques
– million‐way parallelism and beyond

Page 13

Economic Divergence/Optimization

$/teraflop‐year
– declining rapidly

$/developer‐year
– rising rapidly

System complexity
– rising

Applications outlive systems
– by many years

Implications …

[Figure: cost versus time; hardware cost falling, people cost rising]
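The divergence on this slide can be seen with a back-of-the-envelope calculation. All dollar figures below are invented purely for illustration; only the direction of the trends comes from the slide.

```python
# Invented, illustrative figures: only the trends (hardware cost per
# teraflop-year falling, developer cost rising) come from the slides.
hw_cost_per_tflop_year = 100_000      # $/teraflop-year, declining
dev_cost_per_year = 150_000           # $/developer-year, rising

machine_tflop_years = 50              # machine time an application consumes
developer_years = 40                  # staff effort over the same period

hw_total = hw_cost_per_tflop_year * machine_tflop_years   # $5.0M
people_total = dev_cost_per_year * developer_years        # $6.0M
print(people_total > hw_total)  # True: the people term dominates
```

Because applications outlive systems by many years, the people term keeps accruing across machine generations, which is the implication the slide points to: tools that save developer time pay for themselves.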

Page 14

Findings

Petascale is not terascale scaled up
– higher complexity, heterogeneity

Petascale method inadequacies
– manual methods
– purely static and offline approaches

Manual method needs
– anomaly detection and optimization

Purely static and offline methods
– complement with online, adaptive methods

Page 15

Findings

Crucial interactions
– users/staff/developers critical
– education and training
– feedback

Insufficient integration
– among tools
– component reuse

No general pathway for release-quality tools
– hardening, documentation, training, support, …

Page 16

Recommendations 

# | Recommendation | Challenge type | Probability (risk) | Impact
1 | User engagement and training | Training | High | High
2 | Additional information sources, e.g. I/O, memory | Technical | High | Medium
3 | Long-term maintenance and support | Funding & policy | High | High
4 | Funds for technology transfer and deployment | Funding & policy | Medium | Medium
5 | Application-driven development of tools | | Medium | Medium
6 | Substantial advances in automation of diagnosis, optimization and anomaly detection | Technical | High | High
7 | Developing live techniques to extend post-mortem | Technical | Medium | Medium

Page 17

Recommendations (Continued)

# | Recommendation | Challenge type | Probability (risk) | Impact
8 | Integrated, persistent monitoring components | Technical | Medium | Medium
9 | Support for multi-component and multi-disciplinary applications | Technical | High | Medium
10 | Detection of load imbalance | Technical | High | High
11 | Support for heterogeneous and hierarchical hardware | Technical | High | High
12 | Support for new and hybrid programming models | Technical, funding, policy & training | Medium | Medium
13 | Add performance analysis to CS curriculum | Training & policy | Low | Low