Research Computing at Virginia Tech
Advanced Research Computing
Feb 25, 2016
Outline
• ARC Overview
• ARC Resources
• Training & Education
• Getting Started
ARC OVERVIEW
Terascale Computing Facility
2,200-processor Apple G5 cluster; 10.28 teraflops; #3 on the 2003 Top500 list
Advanced Research Computing (ARC)
• Unit within the Office of the Vice President for Information Technology
– Office of the Vice President for Research
• Provide centralized resources for:
– Research computing
– Visualization
• Staff to assist users
• Website: http://www.arc.vt.edu/
Goals
• Advance the use of computing and visualization in VT research
• Centralize resource acquisition, maintenance, and support for the research community
– HPC Investment Committee
• Provide support to facilitate usage of resources and minimize barriers to entry
• Enable and participate in research collaborations between departments
Personnel
• Terry Herdman, Associate VP for Research Computing
• BD Kim, Deputy Director, HPC
• Nicholas Polys, Director, Visualization
• Computational Scientists
– Justin Krometis
– James McClure
– Gabriel Mateescu
• User Support GRAs
ARC RESOURCES
Computational Resources
• BlueRidge – large-scale Linux cluster
• HokieSpeed – GPU cluster
• HokieOne – SGI UV SMP machine
• Athena – data analysis and visualization cluster
• Ithaca – IBM iDataPlex
• Dante – Dell R810
• Other resources for individual research groups
BlueRidge – Large-Scale Cluster
• Resources for running jobs:
– 318 dual-socket nodes with 16 cores/node
– Each socket is an eight-core Intel Sandy Bridge-EP Xeon
– 4 GB/core, 64 GB/node
– Total: 5,088 cores, 20 TB memory
• Two login nodes and two admin nodes
– 128 GB/node
• Interconnect: quad-data-rate (QDR) InfiniBand
• Top500 #402 (November 2012)
• Requires an allocation to run (the only ARC system that does)
• Released to users on March 20, 2013
Allocation System
• Like a bank account for system units
– Jobs run are deducted from the allocation account
• Project PIs (i.e., faculty) request an allocation for a research project
– Based on the research output of the project (papers, grants) and the type of computing/software used
– Once approved, the PI can add other users (faculty, researchers, students)
• Only applies to BlueRidge (no allocation is required to run on the other ARC systems)
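The bookkeeping behind an allocation is simple debit accounting: each job's cost in system units is subtracted from the project's balance. A minimal sketch of that idea in Python, with a hypothetical cores-times-hours charge formula and made-up numbers (ARC's actual charging policy may weight queues or hardware differently):

```python
# Sketch of allocation accounting: jobs debit "system units" from a
# project balance. The charge formula (nodes x cores x hours) and the
# numbers below are illustrative assumptions, not ARC's actual policy.

class Allocation:
    def __init__(self, units):
        self.balance = units  # system units granted to the project

    def charge_job(self, nodes, cores_per_node, hours):
        cost = nodes * cores_per_node * hours
        if cost > self.balance:
            raise RuntimeError("insufficient allocation for this job")
        self.balance -= cost
        return cost

alloc = Allocation(units=10_000)  # hypothetical grant of 10,000 units
cost = alloc.charge_job(nodes=4, cores_per_node=16, hours=12)
print(cost, alloc.balance)  # 768 9232
```

Once the balance hits zero, further jobs are refused until the PI requests more units — which is why the slide compares it to a bank account.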
HokieSpeed – CPU/GPU Cluster
• 206 nodes, each with:
– Two 6-core 2.40 GHz Intel Xeon E5645 CPUs and 24 GB of RAM
– Two NVIDIA M2050 Fermi GPUs (448 CUDA cores each)
• Total: 2,472 CPU cores, 412 GPUs, 5 TB of RAM
• Top500 #221, Green500 #43 (November 2012)
• 14-foot by 4-foot 3D visualization wall
• Intended use: large-scale GPU computing
• Available to NSF grant Co-PIs
HokieOne – SGI UV SMP System
• 492 Intel Xeon 7542 (2.66 GHz) cores
– Two six-core sockets per blade (12 cores/blade)
– 41 blades for applications; one blade for system + login
• 2.6 TB of shared memory (NUMA)
– 64 GB/blade; blades connected with NUMAlink
• SUSE Linux 11.1
• Recommended uses:
– Memory-heavy applications
– Shared-memory (e.g., OpenMP) applications
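OpenMP itself is a directive system for C/C++/Fortran, but the shared-memory model HokieOne targets — many workers reading and writing one address space, each on its own slice of the data — can be illustrated in Python with `multiprocessing` shared arrays. This is an analogue for illustration only, not how OpenMP codes are written:

```python
# Illustration of the shared-memory pattern: workers update disjoint
# slices of a single shared array. (In C/Fortran on HokieOne, the same
# pattern would be an OpenMP "parallel for" over the array.)
from multiprocessing import Process, Array

def square_slice(shared, start, stop):
    # Each worker touches only its own [start, stop) slice.
    for i in range(start, stop):
        shared[i] = shared[i] * shared[i]

if __name__ == "__main__":
    data = Array('d', range(8))  # one array, shared by all workers
    workers = [Process(target=square_slice, args=(data, i * 4, (i + 1) * 4))
               for i in range(2)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(list(data))  # [0.0, 1.0, 4.0, 9.0, 16.0, 25.0, 36.0, 49.0]
```

The key property — no explicit message passing, just a common address space — is what distinguishes this machine from the distributed-memory clusters above.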
Athena – Data Analytics Cluster
• 42 AMD 2.3 GHz Magny-Cours quad-socket, octa-core nodes (total: 1,344 cores, 12.4 TFLOP peak)
• 32 NVIDIA Tesla S2050 GPUs
– 6 GB GPU memory
• Memory: 2 GB/core (64 GB/node, 2.7 TB total)
• Quad-data-rate (QDR) InfiniBand
• Recommended uses:
– GPU computations
– Visualization
– Data-intensive applications
Ithaca – IBM iDataPlex
• 84 dual-socket quad-core Nehalem 2.26 GHz nodes (672 cores in all)
– 66 nodes available for general use
• Memory (2 TB total):
– 56 nodes have 24 GB (3 GB/core)
– 10 nodes have 48 GB (6 GB/core)
• Quad-data-rate (QDR) InfiniBand
• Recommended uses:
– Parallel MATLAB
– ISV applications needing an x86/Linux environment
Dante (Dell R810)
• 4 octa-socket, octa-core nodes (256 cores in all)
• 64 GB RAM
• Intel x86 64-bit, Red Hat Enterprise Linux 5.6
• No queuing system
• Recommended uses:
– Testing, debugging– Specialty software
Visualization Resources
• VisCube: 3D immersion environment with three 10′ × 10′ walls and a floor of 1920×1920 stereo projection screens
• DeepSix: six tiled monitors with a combined resolution of 7680×3200
• Athena GPUs: accelerated rendering
• ROVR Stereo Wall
• AISB Stereo Wall
EDUCATION & TRAINING
Spring 2013 (Faculty Track)
1. Intro to HPC (13 Feb)
2. Research Computing at VT (20 Feb)
3. Shared-Memory Programming in OpenMP (27 Feb)
4. Distributed-Memory Programming using MPI (6 Mar)
5. Two-session courses:
   1. Visual Computing (25 Feb, 25 Mar)
   2. Scientific Programming with Python (1 Apr, 8 Apr)
   3. GPU Programming (10 Apr, 17 Apr)
   4. Parallel MATLAB (15 Apr, 22 Apr)
Workshops
• Offered last: January 2013, August 2012
• Two days, covering:
– High-performance computing concepts
– Introduction to ARC’s resources
– Programming in OpenMP and MPI
– Third-party libraries
– Optimization
– Visualization
• Next offered: Summer 2013?
Other Courses Offered
• Parallel Programming with Intel Cilk Plus (Fall 2012)
• MATLAB Optimization Toolbox (ICAM)
Others being considered/in development:
• Parallel R
Graduate Certificate (Proposed)
• Certificate requirements (10 credits)
– 2 core courses, developed and taught by ARC computational scientists:
• Introduction to Scientific Computing & Visualization (3 credits)
• Applied Parallel Computing for Scientists & Engineers (3 credits)
– A selection of existing coursework (3 credits; list provided in proposal draft)
– HPC&V seminar (1 credit)
– Interdisciplinary coursework (3 credits, optional)
• Administration
– Steering/admissions committee
– Core faculty: develop the courseware and seminar; serve as PhD committee members
– Affiliate faculty: instruct existing courses, give guest lectures, etc.
Proposed Core Courses & Content
• Introduction to Scientific Computing & Visualization
– Programming environment in HPC
– Numerical analysis
– Basic parallel programming with OpenMP and MPI
– Visualization tools
• Applied Parallel Computing for Scientists & Engineers
– Advanced parallelism
– Hybrid programming with MPI/OpenMP
– CUDA/MIC programming
– Optimization and scalability of large-scale HPC applications
– Parallel & remote visualization and data analysis
GETTING STARTED ON ARC’S SYSTEMS
Getting Started Steps
1. Apply for an account (all users)
2. Apply for an allocation (PIs only, for projects wishing to use BlueRidge)
3. Log in to the system via SSH
4. Run the system examples:
   a. Compile
   b. Submit to the scheduler
5. Compile and submit your own programs
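Submitting to the scheduler means wrapping your program in a short batch script and handing it to the queueing system (qsub on PBS/Torque-style schedulers of this era). A sketch of composing such a script in Python; the walltime, account string, and executable name are placeholder assumptions — check ARC's documentation for the actual queue names and limits on each system:

```python
# Compose a minimal PBS/Torque-style job script as a string.
# The resource values, account name, and command below are hypothetical
# placeholders, not ARC's real settings.
def make_job_script(name, nodes, ppn, walltime, account, command):
    return "\n".join([
        "#!/bin/bash",
        f"#PBS -N {name}",                     # job name
        f"#PBS -l nodes={nodes}:ppn={ppn}",    # nodes and cores per node
        f"#PBS -l walltime={walltime}",        # wall-clock limit HH:MM:SS
        f"#PBS -A {account}",                  # allocation account (BlueRidge)
        "cd $PBS_O_WORKDIR",                   # run from the submit directory
        command,
    ])

script = make_job_script("hello", nodes=2, ppn=16,
                         walltime="01:00:00", account="myproject",
                         command="mpirun -np 32 ./hello_mpi")
print(script)
```

The resulting file would be submitted with `qsub hello.sh`; the scheduler then charges the job against the allocation account named by the `-A` line.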
Resources
• ARC website: http://www.arc.vt.edu
• ARC compute resources & documentation: http://www.arc.vt.edu/resources/hpc/
• Allocation system: http://www.arc.vt.edu/userinfo/allocations.php
• New Users Guide: http://www.arc.vt.edu/userinfo/newusers.php
• Training: http://www.arc.vt.edu/userinfo/training.php
Terry L. Herdman
Associate Vice President for Research Computing
Director, Interdisciplinary Center for Applied Mathematics
Professor, Mathematics
Virginia Tech
Research Projects at VT: Interdisciplinary Center for Applied Mathematics
Founded in 1987 to promote and facilitate interdisciplinary research and education in applied and computational mathematics at Virginia Tech. Currently, ICAM has 45 members from 10 departments, 2 colleges, VBI, and ARC.
o Named a SCHEV Commonwealth Center of Excellence in 1990.
o Named a DOD Center of Research Excellence & Transition in 1996.
o Received more than $25 million in external funding from federal sources and numerous industrial partners.
o Received several MURI and other large center grants.
o Leader of the VT effort on the Energy Efficient Building HUB (EEB).
AGILITY - INGENUITY - INTEGRITY
DON’T OVER-PROMISE
KEEP SCIENTIFIC CREDIBILITY & REPUTATION
BUILD EXCELLENT WORKING RELATIONSHIPS WITH INDUSTRY AND NATIONAL LABORATORIES
MATHEMATICAL MODELS FOR MANY DIFFERENT PROBLEMS
ICAM History
Other Agencies
o NATIONAL SCIENCE FOUNDATION – NSF
o NATIONAL AERONAUTICS AND SPACE ADMINISTRATION – NASA
o FEDERAL BUREAU OF INVESTIGATION – FBI
o DEPARTMENT OF HOMELAND SECURITY – DHS
o DEPARTMENT OF ENERGY – DOE EERE, ORNL
o NATIONAL INSTITUTES OF HEALTH – NIH (ID/IQ CONTRACT PROPOSAL)

Industry Funding Sources
AEROSOFT, INC. - BABCOCK & WILCOX - BOEING AEROSPACE - CAMBRIDGE HYDRODYNAMICS - COMMONWEALTH SCIENTIFIC CORP. - HONEYWELL - HARRIS CORP. - LOCKHEED - SAIC - TEKTRONIX - UNITED TECHNOLOGIES - SOTERA DEFENSE SOLUTIONS…

Department of Defense
o AIR FORCE OFFICE OF SCIENTIFIC RESEARCH – AFOSR
o DEFENSE ADVANCED RESEARCH PROJECTS AGENCY – DARPA
o ARMY RESEARCH OFFICE – ARO
o OFFICE OF NAVAL RESEARCH – ONR
o ENVIRONMENTAL TECHNOLOGY DEMONSTRATION & VALIDATION PROGRAM – ESTCP
o VARIOUS AIR FORCE RESEARCH LABS – AFRL
Flight Dynamics Lab - Weapons Lab - Munitions Lab
Sources of ICAM’s Funding
Partner sites (from map): Nestles (Ludwigsburg, Germany), Deutsche Bank (Frankfurt, Germany), Boeing (Seattle), Tektronix (Beaverton), Lockheed (Los Angeles), AeroSoft (Blacksburg), Babcock & Wilcox (Lynchburg), Air Force AFRL (Albuquerque), Harris Corp. (Melbourne), United Technologies (Hartford), Sandia (Albuquerque), NASA (Langley), LLNL – DOE lab (Livermore), LBNL – DOE lab (Berkeley), Air Force Flight Dynamics (Dayton), ORNL (Oak Ridge), NREL – DOE lab (Golden), NASA (Ames), Air Force AEDC (Tullahoma), Air Force Munitions Lab (Eglin), Honeywell (Minneapolis), SAIC (McLean)
Industry-National Lab Partners
International Collaborations
FACULTY DEPARTMENT COLLEGE
Ball, Joseph A. – Mathematics – Science
Baumann, William T. – Electrical Engineering – Engineering
Beattie, Christopher – Mathematics – Science
Borggaard, Jeff – Mathematics – Science
Broadwater, Robert – Electrical Engineering – Engineering
Burns, John A. – Mathematics – Science
Ball, Ken – Mechanical Engineering – Engineering
Cliff, Eugene M. – Aerospace Engineering – Engineering
Day, Martin V. – Mathematics – Science
De Vita, Raffaella – Engr. Science & Mechanics – Engineering
Diplas, Panayiotis – Civil Engineering – Engineering
Gugercin, S. – Mathematics – Science
Hagedorn, George A. – Mathematics – Science
Herdman, Terry L. – Mathematics – Science
Iliescu, Traian – Mathematics – Science
Inman, Daniel J. – Mechanical Engineering – Engineering
Kapania, Rakesh K. – Aerospace Engineering – Engineering
Kim, Jong U. – Mathematics – Science
Kohler, Werner E. – Mathematics – Science
Laubenbacher, Reinhard – Bioinformatics Institute – VBI
Lin, Tao – Mathematics – Science
Lindner, Douglas K. – Electrical Engineering – Engineering
Marathe, Madhav – Bioinformatics Institute – VBI
Neu, Wayne L. – Aerospace Engineering – Engineering
Pierson, Mark – Mechanical Engineering – Engineering
Polys, Nicholas – Research Computing – Information Technology
Prather, Carl L. – Mathematics – Science
Puri, Ishwar – Engr. Science and Mechanics – Engineering
Renardy, Michael – Mathematics – Science
Renardy, Yuriko – Mathematics – Science
Ribbens, Calvin – Computer Science – Engineering
Rogers, Robert C. – Mathematics – Science
Russell, David – Mathematics – Science
Sachs, Ekkehard – Mathematics – Science
Santos, Eunice – Computer Science – Engineering
Shinpaugh, Kevin – Research Computing – Information Technology
Spanos, Aris – Economics – Science
Sun, Shu-Ming – Mathematics – Science
Tyson, John J. – Biology – Science
Vick, Brian – Mechanical Engineering – Engineering
Watson, Layne T. – Computer Science – Engineering
Wheeler, Robert L. – Mathematics – Science
Williams, Michael – Mathematics – Science
Zietsman, L. – Mathematics – Science
o 10 Academic Departments
o 2 Colleges
o VBI
o ARC - IT
FACULTY DEPARTMENT COLLEGE/INSTITUTE
J. T. Borggaard Mathematics Science
J. A. Burns Mathematics Science
E. M. Cliff Aerospace & Ocean Engr. Engineering
T. L. Herdman Mathematics Science
S. Gugercin Mathematics Science
T. Iliescu Mathematics Science
D. J. Inman Mechanical Engineering Engineering
Reinhard Laubenbacher Discrete Modeling VBI
Madhav Marathe Simulation VBI
Henning Mortveit Simulation VBI
Nicholas Polys Visualization ARC- IT
Kevin Shinpaugh HPC ARC - IT
L. Zietsman Mathematics Science
2010 - 2011 CORE MEMBERS*
* DEPENDS ON CURRENT PROJECTS & FUNDING
1 staff person: Misty Bland
ICAM Team
CURRENT ASSOCIATE MEMBERS
• Homeland Security
• Nanotechnology
• Energy Efficient Buildings
• Design of Jets
• HPC - CS & E
• Space Platforms
• Life Sciences: H1N1, cancer, HIV, immune system
• Advanced Control
ICAM History of Interdisciplinary Projects
Good News / Bad News
Good news: every IBG science problem has a mathematics component.
Bad news: no IBG science problem has only a mathematics component.
W.R. Pulleyblank
Director, Deep Computing Institute; Director, Exploratory Server Systems
IBM Research
Two Applications to Aerospace
• Past application: airfoil flutter
• New application: next-generation large space systems
ICAM - Interdisciplinary Center for Applied Mathematics
FEEDBACK CONTROL OF FLUID/STRUCTURE INTERACTIONS
Stealth
• Began as an unclassified project at DARPA in the early 1970s
• Proved that physically large objects could still have a minuscule RCS (radar cross section)
• The challenge was to make it fly!
1993 - 1997: USAF - $2.76M
Optimal Design and Control of Nonlinear Distributed Parameter Systems
University Research Initiative Center Grant
MURI TEAM: VT-ICAM, Boeing, USAF, NC State, Lockheed
F-117A
09/14/97: F-117A crash caused by flutter
MURI TOPIC: CONTROL OF AIR FLOWS
X-29
DARPA ALSO PROVIDED FUNDS FOR THE RENOVATION OF WRIGHT HOUSE – ICAM’s HOME SINCE 1989
1987 - 1991: DARPA - $1.4M
An Integrated Research Program for the Modeling, Analysis and Control of Aerospace Systems
TEAM: VT-ICAM, NASA, USAF
ICAM History of Interdisciplinary Projects
Mathematical research motivated by problems of interest to industry, business, and government organizations as well as the science and engineering communities.
• Mathematical framework: both theoretical and computational
• Projects require expertise in several disciplines
• Projects require HPC
• Projects require computational science: modeling, analysis, algorithm development, optimization, visualization
University Research Team
John Burns, Dennis Brewer, Herman Brunner, Gene Cliff, Yanzhao Cao, Harlan Stech, Janos Turi, Dan Inman, Kazifumi Ito, Graciela Cerezo, Elena Fernandez, Brian Fulton, Z. Liu, Hoan Nguyen, Diana Rubio, Ricardo Sanchez Pena
8 undergraduate students, 10 graduate students

Research Support and Partners
AFOSR, DARPA ACM and SPO, NASA-LaRC, NIA, Flight Dynamics Lab (WPAFB), Lockheed Martin
Build Math Model
• Start simple
• Use and keep the physics (science)
• Use and keep engineering principles
• Do not try to be an expert in all associated disciplines – interdisciplinary team
• Learn enough so that you can communicate
• Know the literature
• Computational/experimental validation
Spring Mass System
h(t): plunge
α(t): pitch angle
β(t): flap angle
Pitching, Plunging and Flap Motions of Airfoil
$z(t) = [\,h(t),\ \alpha(t),\ \beta(t)\,]^T$ (plunge, pitch, flap)
$F(t) = [\,L(t),\ M_\alpha(t),\ M_\beta(t)\,]^T$ (lift and moments)
$M\,\ddot z(t) + B\,\dot z(t) + K\,z(t) = \tfrac{1}{m}\,F(t)$

Force: Lift
$L(t)$: an integro-differential expression in the free-stream speed $U$, the motion $z(t)$, $\dot z(t)$, and the circulation [formula garbled in source]
Note: Lift depends on past history
Evolution Equation for Airfoil Circulation:
[singular integral equation relating the circulation distribution $\gamma(t,x)$ on the airfoil and wake to the motion, with a kernel $k(t,s)$ depending on the free-stream speed $U$; equations garbled in source]

Mathematical Model
• Change the 2nd-order ODE to a 1st-order system
• Couple the ODE with the evolution equation
• The past history of the circulation function provides part of the initial conditions
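The first of these steps is the standard order reduction. Writing the airfoil dynamics as $M\ddot z + B\dot z + Kz = \tfrac{1}{m}F(t)$ with $M$, $B$, $K$ the mass, damping, and stiffness matrices, the reduction reads:

```latex
% First-order reduction of  M \ddot z + B \dot z + K z = \tfrac{1}{m} F(t)
% in the stacked state w = (z, \dot z).
w(t) = \begin{bmatrix} z(t) \\ \dot z(t) \end{bmatrix},
\qquad
\dot w(t) =
\begin{bmatrix} 0 & I \\ -M^{-1}K & -M^{-1}B \end{bmatrix} w(t)
+ \begin{bmatrix} 0 \\ \tfrac{1}{m}\,M^{-1} F(t) \end{bmatrix}
```

Coupling this first-order system with the circulation evolution equation is what produces the delayed (history-dependent) terms in the complete model below.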
Complete Mathematical Model
The coupled system is a non-atomic neutral functional differential equation (NFDE):

$\dfrac{d}{dt}\Big[ A\,x(t) + \displaystyle\int_{-r}^{0} A(s)\,x(t+s)\,ds \Big] = B\,x(t) + \displaystyle\int_{-r}^{0} B(s)\,x(t+s)\,ds$

with 8-component state $x(t) = [\,h(t),\ \dot h(t),\ \alpha(t),\ \dot\alpha(t),\ \beta(t),\ \dot\beta(t),\ \ldots\,]^T$.

• $A$ is a singular 8×8 matrix: last row zeros
• $A(s)$: $A_{8i}(s) = 0$, $i = 1, 2, \ldots, 7$; $A_{88}(s) = [(Us-2)/Us]^{1/2}$, $U$ constant
• $B$ is a constant matrix; $B(s)$ is smooth
• Non-atomic NFDE – need a theory of non-atomic NFDEs
• Well-posedness results
• Approximation techniques
• Parameter identification
• Validation of the model
Abstract Cauchy Problem

NFDE: $\dfrac{d}{dt}\,D x_t = L x_t$

ACP: $\dot z(t) = \mathcal{A}\,z(t)$, $z(0) = z_0$, where $\mathcal{A}$ is the infinitesimal generator of a $C_0$ semigroup $T(t)$

Then $z(t) = T(t)\,z_0$; capture $x(t)$ (or $x_t$) from $z$.
ISAT – Innovative Space-Based Radar Antenna Technology
• Antenna surface
• Inflatable booms
• Canister
• 300 m long truss structure, 1000 m² antenna
• Fly in 2009
• Launched in a container the size of a small SUV
Next Generation Space Systems
• Develop and deploy large space antennas
• Take advantage of new materials
• Take advantage of inflatable technology
• Joint effort: DARPA, NASA LaRC, NIA, and Virginia Tech ICAM, Boeing, Lockheed Martin, JPL, Harris Corp., AFRL, and others
• ICAM – build physics-based mathematical models for simulation and control (after deployment)
• NASA/AFRL – experiments, testing, development, packaging, and deployment
Build Math Model
• Dr. Joe Guerci, SPO, DARPA
• Remember: obey all physics (science) laws
• Need experts in all associated disciplines – interdisciplinary team
• Must communicate across disciplines and organizations
• Know the literature
• Computational/experimental validation
New Mathematical Models
Beam dynamics with hereditary (history-dependent) damping:

$\dfrac{\partial^2}{\partial t^2}\,y(t,x) + \dfrac{\partial^2}{\partial x^2}\Big[\, EI\,\dfrac{\partial^2}{\partial x^2}\,y(t,x) + \displaystyle\int_0^\infty g(s)\,\dfrac{\partial^3}{\partial x^2\,\partial t}\,y(t-s,x)\,ds \,\Big] = 0$

Including Thermal Effects Changes Everything

ADD THERMAL EQUATIONS: a coupled equation for the temperature $\theta(t,x)$,

$\dfrac{\partial}{\partial t}\theta(t,x) - \dfrac{\partial^2}{\partial x^2}\theta(t,x) + \dfrac{\partial^3}{\partial x^2\,\partial t}\,y(t,x) = f(t,x)$

with distributed control input $b(x)\,u(t)$.

MORE ACCURATE – MORE COMPLEX – MORE DIFFICULT
Necessary Components for Success
• research expertise in many areas – interdisciplinary team
• experience (knowledge of what may work)
• MATHEMATICS
• external support
• state-of-the-art computing facilities
• GRAs and young research faculty (new ideas)