Large-Scale Molecular Dynamics Simulations of Materials on Parallel Computers
Transcript
Large-Scale Molecular Dynamics Simulations of Materials on Parallel Computers
Aiichiro Nakano & Priya Vashishta
Concurrent Computing Laboratory for Materials Simulations
Department of Computer Science, Department of Physics & Astronomy
Louisiana State University
Email: [email protected]  URL: www.cclms.lsu.edu
VII International Workshop on Advanced Computing & Analysis Techniques in Physics Research
Organizers: Dr. Pushpalatha Bhat & Dr. Matthias Kasemann
October 19, 2000, Fermilab, IL
Outline
1. Scalable atomistic-simulation algorithms
2. Multidisciplinary hybrid-simulation algorithms
3. Large-scale atomistic simulation of nanosystems
> Nanophase & nanocomposite materials
> Nanoindentation & nano-impact damage
> Epitaxial & colloidal quantum dots
4. Ongoing projects
Concurrent Computing Laboratory for Materials Simulations
Postdocs/research faculty: Martina Bachlechner, Tim Campbell, Hideaki Kikuchi, Sanjay Kodiyalam, Elefterios Lidorikis, Fuyuki Shimojo, Laurent Van Brutzel, Phillip Walsh
Ph.D. Students: Gurcan Aral, Paulo Branicio, Jabari Lee, Xinlian Liu, Brent Neal, Cindy Rountree, Xiaotao Su, Satavani Vemparala, Troy Williams
Visitors: Elisabeth Bouchaud (ONERA), Antonio da Silva (São Paulo),Simon de Leeuw (Delft), Ingvar Ebbsjö (Uppsala), Hiroshi Iyetomi (Niigata), Shuji Ogata (Yamaguchi), Jose Rino (São Carlos)
• Ph.D. in physics & M.S. in computer science in 5 years: broad career options (APS News, August/September '97)
• Synergism between HPCC (M.S.) & application (Ph.D.) research: best dissertation award (Andrey Omeltchenko, '97)
• Graph-theoretical data mining of topological defects
2. Multidisciplinary Hybrid-Simulation Algorithms
Multiscale Simulation
Lifetime prediction of safety-critical micro-electro-mechanical systems (MEMS)
• Engineering mechanics experimentally validated above 1 μm
• Atomistic simulation possible below 0.1 μm
[R. Ritchie, Berkeley]
Bridging the length-scale gap by seamlessly coupling:
• Finite-element (FE) calculation based on elasticity;
• Atomistic molecular-dynamics (MD) simulation;
• Ab initio quantum-mechanical (QM) calculation.
Hybrid QM/MD Algorithm
[Schematic: a QM cluster embedded in the MD region, joined by handshake atoms]
• Additive hybridization: reuse of existing QM & MD codes
• Handshake atoms: seamless coupling of the QM & MD systems
MD simulation embeds a QM cluster described by a real-space multigrid-based density functional theory
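The additive hybridization can be sketched in a few lines. The energy functions below are toy placeholders of our own (not the actual QM or MD codes), used only to show the bookkeeping:

```python
def hybrid_energy(system, cluster, e_md, e_qm):
    """Additive QM/MD hybridization:
    E = E_MD(whole system) + E_QM(cluster) - E_MD(cluster).
    The MD description of the cluster is subtracted and replaced
    by the QM one, so both existing codes are reused unchanged."""
    return e_md(system) + e_qm(cluster) - e_md(cluster)

# Toy per-atom energies (placeholders, not real potentials):
e_md = lambda atoms: -1.0 * len(atoms)   # MD: -1.0 "eV" per atom
e_qm = lambda atoms: -1.2 * len(atoms)   # QM: deeper well for the cluster

system = list(range(1000))   # 1000 atoms in the MD region
cluster = system[:20]        # 20 of them handed to the QM solver
E = hybrid_energy(system, cluster, e_md, e_qm)   # -1000 - 24 + 20 = -1004
```

In the real coupling, the handshake atoms terminate the QM cluster so that both energy evaluations are well defined.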
Hybrid MD/FE Algorithm
• FE nodes & MD atoms coincide in the handshake region
• Additive hybridization
[Schematic: MD and FE regions joined in a handshake (HS) layer, with [0 1 1]-, [1 1 1]-, and [2 1 1]-type crystallographic directions marked]
Oxidation on Si Surface
Dissociation energy of O2 on a Si (111) surface is dissipated seamlessly from the QM cluster through the MD region to the FE region.
[Snapshot: QM cluster (QM O, QM Si, and handshake H atoms) embedded in the MD Si region, coupled to the FE region]
3. Large-Scale Atomistic Simulation of Nanosystems
CCLMSCCLMSCCLMSCCLMS
Fracture Simulation & Experiment
• Microcrack coalescence
• Multiple branching
[Fracture images: Si3N4 (MD) vs. Ti3Al alloy (E. Bouchaud); graphite (MD) vs. glass (K. Ravi-Chandar)]

Fracture Energy of GaAs: 100-Million-Atom MD Simulation
256 Cray T3E processors at DoD's NAVO-MSRC
[Snapshot, 1.3 μm across, color-coded by shear stress from -0.8 to 0.8 GPa]

Good agreement with experiments:

Plane   Gc (MD)     Gc (expt.)
(110)   1.4 ± 0.1   1.72*, 1.52#

*Messmer ('81)  #Michot ('88)
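As a quick check of the "good agreement" claim, the relative deviation of the MD value from each measurement works out to under 20% (the arithmetic below is ours; units as on the slide):

```python
gc_md = 1.4                    # MD fracture energy, (110) plane
gc_expt = [1.72, 1.52]         # Messmer ('81), Michot ('88)

# Percent deviation of the MD value from each experiment
deviation_pct = [abs(gc_md - g) / g * 100 for g in gc_expt]
# roughly 18.6% and 7.9%
```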
Si3N4-SiC Fiber Nanocomposite
Fracture surfaces in ceramic-fiber nanocomposites: toughening mechanisms?
1.5-billion-atom MD on 1,280 IBM SP3 processors at NAVO-MSRC
[Snapshots, 0.3 μm across, color-coded by material (Si3N4, SiC, SiO2) and by pressure from -5 to >20 GPa]
Nanoindentation on Silicon Nitride Surface
Use an Atomic Force Microscope (AFM) tip for nanomechanical testing of hardness
Highly compressive/tensile local stresses
10-million-atom MD at ERDC-MSRC
Indentation Fracture & Amorphization
• Indentation fracture at indenter diagonals (<1210>-type directions)
• Amorphous pile-up at indenter edges
• Anisotropic fracture toughness
[Top view along <0001>, with <1010> and <1210> directions marked]
Hypervelocity Impact Damage
Design of damage-tolerant spacecraft
Impact graphitization
Diamond impactor
Impact velocity: 8 - 15 km/s
Diamond coating
[Movie: meteoroid detector on the Mir orbiter]
Reactive bond-order potential (Brenner, ‘90)
V = 8 km/s
V = 15 km/s
V = 11 km/s
Impact-Velocity Sensitivity
Crossover from quasi-elastic to evaporation at ~ 10 km/s
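A back-of-the-envelope estimate (ours, not from the slides) makes the ~10 km/s crossover plausible: at that speed the kinetic energy per carbon atom becomes comparable to diamond's cohesive energy of roughly 7.4 eV/atom.

```python
AMU = 1.66054e-27        # kg per atomic mass unit
EV = 1.602177e-19        # J per electron-volt
M_CARBON = 12.011 * AMU  # mass of one carbon atom, kg

def ke_per_atom_ev(v_km_s):
    """Kinetic energy of one carbon atom at impact speed v (km/s), in eV."""
    v = v_km_s * 1e3     # convert to m/s
    return 0.5 * M_CARBON * v**2 / EV

# At 10 km/s this gives about 6.2 eV/atom, near diamond's ~7.4 eV/atom
# cohesive energy; 8 km/s falls well below it and 15 km/s well above,
# consistent with the quasi-elastic-to-evaporation crossover.
```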
Epitaxially Grown Quantum Dots
A. Madhukar (USC)
Substrate-encoded size-reducing epitaxy
GaAs (001) substrate; <100> square mesas
[Cross-sectional image: GaAs quantum dots (QD) in AlGaAs on a 70 nm mesa, 10 nm scale bar, [101] and [001] directions marked]
Stress Domains in Si3N4/Si Nanopixels
27-million-atom MD simulation
• Stress domains in Si due to an amorphous Si3N4 film
• Stress well in Si with a crystalline Si3N4 film, due to lattice mismatch
[Snapshots of Si3N4/Si nanopixels color-coded by stress from -2 GPa to 2 GPa]
Colloidal Semiconductor Quantum Dots
High-pressure structural transformation in a 30 Å GaAs nanocrystal (snapshots at 17.5 and 22.5 GPa):
• Nucleation at surface
• Multiple domains
Applications
• LED, display
• Pressure synthesis of novel materials
Oxide Growth in an Al Nanoparticle
Oxide thickness saturates at 40 Å after 0.5 ns: excellent agreement with experiments
Unique metal/ceramic nanocomposite
[Snapshots: Al core with an AlOx shell, 70 Å and 110 Å particles]
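The saturating growth reported above is well described by a law of the form x(t) = x_sat(1 - e^(-t/τ)). The time constant below is an assumed value chosen only so the curve is essentially saturated by 0.5 ns; it is an illustration, not a fit to the simulation data:

```python
import math

X_SAT = 40.0   # saturation thickness in angstroms (from the simulation)
TAU = 0.1      # time constant in ns (assumed, for illustration only)

def oxide_thickness(t_ns):
    """Saturating oxide growth: x(t) = x_sat * (1 - exp(-t / tau))."""
    return X_SAT * (1.0 - math.exp(-t_ns / TAU))

# By t = 0.5 ns the thickness is within 1% of the 40 A saturation value.
```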
4. Ongoing Projects
Information Grid
Metacomputing collaboration with DoD MSRCs: 4-billion-atom MD simulation of 0.35 μm fiber composites
http://www.nas.nasa.gov/aboutNAS/Future2.html
Universal access to networked supercomputing
I. Foster & C. Kesselman, The Grid: Blueprint for a New Computing Infrastructure ('99)
MD Moore's Law
The number of atoms in MD simulations has doubled:
• Every 19 months over the past 36 years for classical MD
• Every 13 months over the past 15 years for DFT-MD
A petaflop computer will enable 10^12-atom MD & 10^7-atom QM
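Those doubling times compound quickly; a short calculation (ours) shows what they imply per decade:

```python
def growth_per_decade(doubling_months):
    """Growth factor over 10 years (120 months) for a given doubling time."""
    return 2.0 ** (120.0 / doubling_months)

classical_md = growth_per_decade(19)   # about 80x per decade
dft_md = growth_per_decade(13)         # about 600x per decade
```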
[Plot: atom count vs. year, from the CDC 3600 to the 1,280-processor IBM SP3]
Hybrid Simulation of Functionalized AFM
Nanodevices to design new biomolecules
[Schematic: a Si3N4 AFM tip modeled with coupled QM, MD, and FE regions]
Biological Computation &Visualization Center, LSU($3.9M, 2000- )
Large-scale, multiscale simulations of realistic nanoscale systems will be possible in a metacomputing environment of the Information Grid.
Conclusion
Research supported by NSF, AFOSR, ARO, USC/LSU MURI, DOE, NASA, and a DoD Challenge Applications Award.