2004/12/22 National Taiwan Normal University, Taipei, Taiwan, Republic of China

Brief Outline of the Earth Simulator and Our Research Activities
AND A Lesson Learnt from the Past Three Years

Wataru Ohfuchi ([email protected])
Earth Simulator Center, Japan Agency for Marine-Earth Science and Technology
Atmospheric and Oceanic Simulation Group and AFES Working Team
Brief Outline of the Earth Simulator and Our Research Activities
• Adapted from CCSR/NIES AGCM 5.4.02
  – Center for Climate System Research, the Univ. of Tokyo
  – Japanese National Institute for Environmental Studies
• Rewritten totally from scratch with FORTRAN 90, MPI and microtasking
T1279L96 AFES’s Scalability
[Figure: AFES scalability at T1279L96 — sustained performance (Tflops) vs. number of CPUs, plotted against the peak curve]

  #CPU   Sustained        Efficiency
   640     3.86 Tflops      75.3%
  1280     7.61 Tflops      74.3%
  2560    14.50 Tflops      70.8%
  5120    26.58 Tflops      64.9%
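The efficiency figures on this slide can be reproduced with a short sketch, assuming the Earth Simulator's published per-processor peak of 8 Gflops (the CPU-count pairing is inferred from the percentages, not stated on the slide):

```python
# Recompute sustained efficiency from the scalability figure.
# Assumption: each ES arithmetic processor peaks at 8 Gflops.
PEAK_PER_CPU_GFLOPS = 8.0

runs = {  # CPUs -> sustained Tflops, read off the slide
    640: 3.86,
    1280: 7.61,
    2560: 14.50,
    5120: 26.58,
}

for ncpu, sustained in runs.items():
    peak_tflops = ncpu * PEAK_PER_CPU_GFLOPS / 1000.0
    efficiency = 100.0 * sustained / peak_tflops
    print(f"{ncpu:5d} CPUs: {sustained:6.2f} / {peak_tflops:5.2f} Tflops "
          f"= {efficiency:.1f}% of peak")
```

The recomputed values round to the percentages on the slide (64.9%, 70.8%, 74.3%; the 640-CPU run rounds to 75.4% vs. the slide's 75.3%, presumably truncation).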
AFES won the Gordon Bell Award for Peak Performance!!!
Meso-scale Resolving T1279L96 Simulations
• Typhoons, wintertime cyclogenesis and Baiu-Meiyu front
– Interactions between large-scale circulations and meso-scale phenomena
– Self-organization of meso-scale circulations in larger circulation field
• Short-term (10 days to 2 weeks)
  – CPU power is NOT a problem; data size (~terabytes) is the problem.
Summary
• With the combination of the ES and models well optimized for its architecture, it is now possible to conduct meaningful (ultra-)high-resolution global simulations for the first time in the history of computational atmospheric and oceanic sciences, and geophysical fluid dynamics.
• Interaction between meso-scale phenomena and larger-scale circulation can be studied.
• Scientifically new knowledge and contribution to society are expected.
A Possible Future Direction of High Performance Computing in Atmospheric and Oceanic Sciences: A Lesson Learnt from the Past Three Years with the Earth Simulator
• Unfortunately, the Earth Simulator was not perfect, of course.
• What I foresee as a future modeling strategy.
• What I foresee as future HPC in AOS.
• It will be published in "Advances in Science: Earth Science", edited by Prof. Peter Sammonds, in the Royal Society's Philosophical Transactions (2005).
How Many Points Are There in the 10-km Mesh AGCM?
• T1279L96
  – 1279 spherical harmonics with the so-called "triangular truncation".
  – 3840 (longitude) × 1920 (latitude) × 96 (layers) = ~700 M points.
• Assume Double Precision (8 B) and 100 Variables…
  – ~560 GB.
  – Actually, the T1279L96 AFES needs about 1.2 TB of memory.
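The back-of-the-envelope memory estimate above can be sketched in a few lines (the ~2× gap to the real 1.2 TB is presumably working arrays, halo regions, and so on, which the slide does not break down):

```python
# Naive memory estimate for the T1279L96 grid, as on the slide.
nlon, nlat, nlev = 3840, 1920, 96
points = nlon * nlat * nlev
print(f"grid points: {points:,}")  # 707,788,800, i.e. ~700 M

bytes_per_value = 8   # double precision
n_variables = 100     # assumed on the slide
naive_bytes = points * bytes_per_value * n_variables
print(f"naive memory: {naive_bytes / 1e9:.0f} GB")  # ~566 GB (slide: ~560 GB)
```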
How Much Data Are We Producing with the 10-km Mesh AGCM?
• One 3-D Snapshot.
  – 2.6 GB.
• Oh, We Need 6-hourly Output!!! Ten 3-D Variables!!! For One Day…
  – 2.6 GB × 4/day (6-hourly) × 10 variables = 104 GB/day.
• Oh, We Want to Integrate for 10 Days…
  – ~1 TB.
• Oh, We Are Climatologists!!! We Want to Integrate for 10000 Days…
  – ~1 PB.
• Oh, We Need Ten Sensitivity Tests!!!
  – 10 PB.
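The output cascade above can be reproduced with a short sketch. One assumption not stated on the slide: the ~2.6 GB snapshot size falls out if each 3-D field is written in single precision (4 bytes/value) rather than the double precision used in memory:

```python
# Output-volume cascade for the 10-km mesh AGCM.
points = 3840 * 1920 * 96               # T1279L96 grid
snapshot_gb = points * 4 / 2**30        # one 3-D field, single precision (assumed)
per_day_gb = snapshot_gb * 4 * 10       # 6-hourly output, 10 variables

print(f"snapshot:   {snapshot_gb:.1f} GB")              # ~2.6 GB
print(f"per day:    {per_day_gb:.0f} GB")               # ~105 GB/day (slide: 104)
print(f"10 days:    {per_day_gb * 10 / 1024:.1f} TB")   # ~1 TB
print(f"10000 days: {per_day_gb * 10000 / 1024**2:.1f} PB")      # ~1 PB
print(f"x10 tests:  {per_day_gb * 10000 * 10 / 1024**2:.0f} PB") # ~10 PB
```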
Future HPC in the World (in 2010…)
• VERY Unfortunately, the HPC Hardware Business Is Currently Dominated by IMPERIAL JAPAN and the US of A!!!
• Suppose an Emerging Supercomputing Country, Taiwan, Republic of China, Takes Over Submerging Japan and the USA within a Few Years.
• The Earth System Simulator, the National Taiwan Normal University.
  – 1 PFlops machine (25 times larger than the ES).
  – 1 exabyte of hard disk / PROJECT!!!
  – 1 zettabyte of long-term storage / PROJECT!!!
So, What Can We Do with the Earth System Simulator at the National Taiwan Normal University in 2010?
• When You Increase the "Resolution" by a Factor of Two…
  – 2 (longitude) × 2 (latitude) × 2 (vertical levels) × 2 (time) = 16, roughly the factor of 25 the new machine provides.
• The Current Biggest Global Atmospheric Simulation Project on the ES is ~3-km Mesh (Nonhydrostatic).
  – So, a ~1.5-km mesh simulation.
  – Sorry, it's not "cloud resolving" yet!!!
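The scaling argument above, sketched out: doubling resolution multiplies the cost by 2 in longitude, latitude, and vertical levels, and (via the CFL condition) doubles the number of time steps, so 25× more compute buys roughly one factor of two in mesh size:

```python
# Cost of one resolution doubling in a 3-D atmospheric model.
cost_per_doubling = 2 * 2 * 2 * 2   # lon x lat x levels x time steps = 16
speedup = 1e15 / 40e12              # 1 PFlops machine vs the ~40 Tflops ES = 25

print(f"one doubling costs {cost_per_doubling}x; we have {speedup:.0f}x")
# 25x affords roughly one doubling: ~3-km mesh -> ~1.5-km mesh, no more
print(speedup >= cost_per_doubling)             # True
print(speedup >= cost_per_doubling ** 2)        # False: two doublings need 256x
```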
We Need to Think Better Than That!!!
• Multi-scale Modeling.
• "Stand-alone" Global "Hydrostatic" Model.
• "Stand-alone" Regional Nonhydrostatic Model.
  – As "super-parameterization".
• "Stand-alone" 3-D Turbulence Model.
• "Super-parameterization"-like link between these models.
• We may have to go down to "explicit" cloud physics.
• Of course, 3-D radiation!!!
Conclusions 1
• HPC is not only number-crunching capability.
  – Linpack has become totally obsolete.
• Data handling is much, much, much MORE important. We are already in the middle of the storm of data!
  – Hard disk.
  – Long-term data storage.
  – Software.
• But still we need to think much better.
  – Just increasing resolution does not seem to lead to a breakthrough.
Conclusions 2
• A HUGE HPC System should be used as a whole.
  – The ES consists of 640 nodes.
  – Sorry, those jobs that require less than ~320 nodes should go away.
• "Expensive" vs. "Cheapo"
  – Vector vs. scalar?
  – We need to think about "cost effectiveness".
  – It may depend on the problem.
• GRID?
  – Probably very good for data sharing.
  – Simulations?
• We need to integrate science and engineering.
  – At least we need to understand both and have "strong" opinions.