Page 1:

Community Software Development with the Astrophysics Simulation Collaboratory

Authors: Gregor von Laszewski, Michael Russell, Ian Foster, John Shalf, Gabrielle Allen, Greg Daues, Jason Novotny, Edward Seidel

Presenter: Javier Munoz
Agnostic: Ana Rodriguez

Page 2:

Outline

Introduce the ASC

Cactus: Architecture

Cactus: Math

Cactus: Scaling Out

Gridsphere

Agnostic Questions

Page 3:

Outline

Introduce the ASC

Cactus: Architecture

Cactus: Math

Cactus: Scaling Out

Gridsphere

Agnostic Questions

Page 4:

NSF Award Abstract #9979985
KDI: An Astrophysics Simulation Collaboratory: Enabling Large Scale Simulations in Relativistic Astrophysics

NSF Org: PHY
Latest Amendment Date: September 22, 2003
Award Number: 9979985
Award Instrument: Standard Grant
Program Manager: Beverly K. Berger, PHY Division of Physics, MPS Directorate for Mathematical & Physical Sciences
Start Date: September 15, 1999
Expires: August 31, 2004 (Estimated)
Expected Total Amount: $2,200,000.00 (Estimated)
Investigators: Wai-Mo Suen [email protected] (Principal Investigator); Ian Foster, Edward Seidel, Michael L. Norman, Manish Parashar (Co-Principal Investigators)
Sponsor: Washington University, One Brookings Drive, Campus Box, Saint Louis, MO 63130-4899, 314/889-5100
NSF Program: 8877 KDI-COMPETITION

Page 5:

Astrophysics Simulation Collaboratory (ASC)

Astrophysics: General Theory of Relativity

Simulation: Numerical solution of Partial Differential Equations

Collaboratory: Infrastructure to support efforts to solve large complex problems by geographically distributed participants.

Page 6:

Tired of Middleware?

The ASC is a complete Application

BUT we’ll talk middleware anyway…

Page 7:

Focus

Application (70%)

Middleware (30%)

Page 8:

ASC: Purpose

Community (VO)
Domain-specific components
Transparent access
Deployment services
Collaboration during execution
Steering of simulations
Multimedia streams

Page 9:

ASC: Technologies Used

Cactus Framework

Application Server

Grid Tools

Page 10:

Cactus Simulations in the ASCportal (Agnostic 4)

www.ascportal.org

Page 11:

Outline

Introduce the ASC

Cactus: Architecture (Agnostic 1)

Cactus: Math

Cactus: Scaling Out

Gridsphere

Agnostic Questions

Page 12:

Cactus

1995, original version: Paul Walker, Joan Masso, Edward Seidel, John Shalf

1999, Cactus 4.0 Beta 1: Tom Goodale, Joan Masso, Gabrielle Allen, Gerd Lanfermann, John Shalf

Page 13:

Why Cactus?

Parallelization Model
Easy to Grid-enable
Flexible
C and Fortran

Page 14:

Cactus

Modularity:
New equations (Physics)
Efficient PDE solution (Numerical Analyst)
Improved distributed algorithm (CS)

Page 15:

Cactus Building Blocks:

Schedule
Driver
Flesh
Thorns
Arrangements
Toolkit

www.cactuscode.org

Page 16:

Scheduling (workflow)

The Flesh invokes a driver to process the schedule.

www.cactuscode.org

Page 17:

Driver

Parallelizes the execution
Management of grid variables: storage, distribution, communication
Distributed memory model: each processor holds a section of the global grid
Boundaries: physical or internal

Each thorn is presented with a standard interface, independent of the driver.

Page 18:

Driver: PUGH (Parallel Unigrid Grid Hierarchy)

MPI
Uniform mesh spacing
Non-adaptive
Automatic grid decomposition
Manual decomposition: number of processors in each direction, number of grid points on each processor (see the sketch below)
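A minimal sketch of the kind of even splitting such a unigrid driver performs, shown for a 1-D grid in C; the function name local_extent and the even-split policy are illustrative assumptions, not PUGH's actual code:

/* Illustrative only: split npoints grid points as evenly as possible
 * across nprocs ranks; ranks 0..rem-1 receive one extra point when
 * npoints is not divisible by nprocs. */
#include <stdio.h>

static void local_extent(int npoints, int nprocs, int rank,
                         int *start, int *count)
{
    int base = npoints / nprocs;
    int rem  = npoints % nprocs;
    *count = base + (rank < rem ? 1 : 0);
    *start = rank * base + (rank < rem ? rank : rem);
}

int main(void)
{
    /* Example: 1000 grid points on 6 ranks. */
    for (int rank = 0; rank < 6; ++rank) {
        int start, count;
        local_extent(1000, 6, rank, &start, &count);
        printf("rank %d: points [%d, %d)\n", rank, start, start + count);
    }
    return 0;
}

In three dimensions the same idea is applied along each direction of the processor topology.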

Page 19:

Flesh

In general, thorns overload or register their capabilities with the Flesh, agreeing to provide a function with the correct interface
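The registration idea can be pictured with plain C function pointers. The sketch below is a generic illustration of the pattern, not the real Cactus flesh API; the registry type and the names register_capability, invoke, and wave_evolve are invented for the example:

/* Generic illustration of the flesh/thorn registration pattern:
 * a thorn registers a function with an agreed-upon interface, and
 * the flesh later invokes it without knowing which thorn provided it. */
#include <stdio.h>
#include <string.h>

#define MAX_CAPS 16

typedef int (*cap_fn)(double *grid, int npoints);   /* agreed interface */

struct registry {
    const char *name[MAX_CAPS];
    cap_fn      fn[MAX_CAPS];
    int         count;
};

static void register_capability(struct registry *r, const char *name, cap_fn fn)
{
    r->name[r->count] = name;
    r->fn[r->count]   = fn;
    r->count++;
}

static int invoke(struct registry *r, const char *name, double *grid, int npoints)
{
    for (int i = 0; i < r->count; ++i)
        if (strcmp(r->name[i], name) == 0)
            return r->fn[i](grid, npoints);   /* "flesh" calls thorn code */
    return -1;                                /* capability not provided */
}

/* A "thorn" providing an evolution step with the required signature. */
static int wave_evolve(double *grid, int npoints)
{
    for (int i = 0; i < npoints; ++i) grid[i] *= 0.5;  /* dummy update */
    return 0;
}

int main(void)
{
    struct registry flesh = { .count = 0 };
    double grid[8] = {1, 1, 1, 1, 1, 1, 1, 1};

    register_capability(&flesh, "Evolve", wave_evolve);   /* thorn registers */
    invoke(&flesh, "Evolve", grid, 8);                    /* flesh schedules it */

    printf("grid[0] after evolve = %g\n", grid[0]);
    return 0;
}

The point is that the flesh only knows the agreed-upon signature; which thorn supplied the function is irrelevant to the caller.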

Page 20:

Thorn: Anatomy

Cactus Configuration Language
Parameter Files (Input)
Configuration File
Interface
Schedule
Application Code: C, Fortran
Miscellaneous: Documentation, Make

www.cactuscode.org

Page 21:

Thorn: Files

Param.ccl: What are the parameters for my thorn? What are their ranges? Are they steerable? What parameters do I need from other thorns? Which of my parameters should be available to other thorns?

Interface.ccl: What does my thorn “do”? What are my thorn's grid variables? What variables do I need from other thorns? What variables am I going to make available to other thorns? Timelevels. Ghostzones.

Schedule.ccl: When and how should my thorn's routines be run? How do my routines fit in with routines from other thorns? Which variables should be synchronized on exit?

Page 22:

Objectives

Introduce the ASC

Cactus: Architecture

Cactus: Math

Cactus: Scaling Out

Gridsphere

Agnostic Questions

Page 23:

Finite Differencing: Infinitesimal vs Small Delta

http://homepage.univie.ac.at/franz.vesely/cp_tut/nol2h/applets/HO.html
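The "infinitesimal vs. small delta" contrast is simply the replacement of the derivative's limit by a finite step; for example, the standard central-difference approximation (a textbook formula, not specific to Cactus):

  f'(x) \;=\; \lim_{h \to 0} \frac{f(x+h) - f(x-h)}{2h} \;\approx\; \frac{f(x+h) - f(x-h)}{2h}, \qquad \text{error } O(h^{2}) \text{ for small } h.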

Page 24:

Finite Differencing: Ghost Zones

The grid on each processor has an extra layer of grid-points (in blue) which are copies of those on the neighbor in that direction

After the calculation step the processors exchange these ghost zones to synchronize their states.

Cactus 4.0 User’s Guide
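A minimal 1-D sketch of the ghost-zone exchange described above, assuming MPI and one ghost point per side; the array layout and constants are illustrative, not PUGH's implementation:

/* Illustrative 1-D ghost-zone exchange with MPI: each rank owns
 * u[1..LOCAL]; u[0] and u[LOCAL+1] are ghost points holding copies of
 * the neighbouring ranks' boundary values. */
#include <mpi.h>
#include <stdio.h>

#define LOCAL 100   /* interior points owned by each rank */

int main(int argc, char **argv)
{
    int rank, size;
    double u[LOCAL + 2];   /* indices 0 and LOCAL+1 are ghost zones */

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    for (int i = 0; i < LOCAL + 2; ++i) u[i] = rank;   /* dummy data */

    int left  = (rank > 0)        ? rank - 1 : MPI_PROC_NULL;
    int right = (rank < size - 1) ? rank + 1 : MPI_PROC_NULL;

    /* Send my leftmost interior point to the left neighbour while
     * receiving the right neighbour's into my right ghost zone ... */
    MPI_Sendrecv(&u[1],         1, MPI_DOUBLE, left,  0,
                 &u[LOCAL + 1], 1, MPI_DOUBLE, right, 0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    /* ... and the mirror image in the other direction. */
    MPI_Sendrecv(&u[LOCAL],     1, MPI_DOUBLE, right, 1,
                 &u[0],         1, MPI_DOUBLE, left,  1,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    /* A stencil such as u_new[i] = 0.5*(u[i-1] + u[i+1]) can now be
     * applied to all interior points without further communication. */
    printf("rank %d: left ghost = %g, right ghost = %g\n",
           rank, u[0], u[LOCAL + 1]);

    MPI_Finalize();
    return 0;
}

Physical boundaries (MPI_PROC_NULL neighbours) are simply left untouched here; a real driver would apply boundary conditions instead.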

Page 25:

Finite Differencing: Time Levels

Similar to Ghost Zones for the time dimension

Cactus managed, which leads to optimization
Dependent on the numerical differentiation algorithm
Typically three (illustrated below)
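As an illustration of why three levels are typical, a leapfrog-style update for du/dt = F(u) (a generic textbook scheme, not necessarily the one a given thorn uses) needs the past and present levels to produce the new one:

  u^{n+1} \;=\; u^{n-1} + 2\,\Delta t\, F(u^{n}),

so u^{n-1}, u^{n}, and u^{n+1} must all be stored at once.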

Page 26:

Finite Differencing: Synchronization

Cost of parallelization
Network characteristics important
Transfer of 12 MBytes per iteration (estimated roughly below)

BUT…there is room for optimization
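A rough counting argument behind a per-iteration transfer figure of this kind, using generic symbols (the slides do not state the grid size, ghost width, or variable count behind the 12 MByte number):

  \text{bytes per face} \;\approx\; N^{2} \cdot g \cdot V \cdot 8,

with N points along an edge of the local block, g ghost layers, V exchanged grid variables, and 8 bytes per double-precision value.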

Page 27:

Objectives

Introduce the ASC

Cactus: Architecture

Cactus: Math

Cactus: Scaling Out

Gridsphere

Agnostic Questions

Page 28:

Cactus at Work

Members of the Cactus and Globus projects after winning a Gordon Bell Prize in high-performance computing for the work described in their paper “Supporting Efficient Execution in Heterogeneous Distributed Computing Environments with Cactus and Globus”.

Page 29:

What did they do?

Scaled out: Grid-enabled four supercomputers, reaching 249 GFlops

Efficiently: scaling efficiency (defined below) of
88% with 1140 CPUs
63% with 1500 CPUs
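Scaling efficiency here is the usual parallel-efficiency measure (standard definition, not specific to this run):

  E(N) \;=\; \frac{S(N)}{N} \;=\; \frac{T(1)}{N\,T(N)},

so 88% on 1140 CPUs corresponds to roughly 0.88 × 1140 ≈ 1000 CPUs' worth of ideal throughput.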

Page 30:

Scaling Out Finite Differencing Solutions to PDEs

Problem:
Nodes with different types of processors and memory sizes
Heterogeneous communication among processors: multiprocessors, LAN, WAN
Bandwidth, TCP, and latency

Page 31:

Scaling Out in a Computational Grid

Strategies:
Irregular data distribution (see the sketch below)
Grid-aware communication schedules
Redundant computation
Protocol tuning
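A minimal C sketch of what "irregular data distribution" can mean in practice: give each processor a share of the grid proportional to an assumed relative speed, so faster nodes get more points. The speeds[] values and the proportional policy are illustrative assumptions, not the scheme used in the Gordon Bell runs:

/* Illustrative only: an "irregular" 1-D distribution that assigns each
 * processor a share of the grid proportional to its assumed relative
 * speed, so faster nodes receive more points. */
#include <stdio.h>

#define NPROCS 4

int main(void)
{
    const int npoints = 1000;                            /* global grid points */
    const double speeds[NPROCS] = {1.0, 1.0, 2.0, 4.0};  /* assumed relative speeds */

    double total = 0.0;
    for (int p = 0; p < NPROCS; ++p) total += speeds[p];

    int assigned = 0;
    for (int p = 0; p < NPROCS; ++p) {
        /* the last processor absorbs any rounding leftovers */
        int count = (p == NPROCS - 1)
                        ? npoints - assigned
                        : (int)(npoints * speeds[p] / total);
        printf("proc %d: %d points starting at index %d\n", p, count, assigned);
        assigned += count;
    }
    return 0;
}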

Page 32:

Where are they now? (Agnostic 5,8)

Gabrielle Allen: Assistant Director for Computing Applications, Center for Computation & Technology; Associate Professor, Department of Computer Science, Louisiana State University

Edward Seidel: Director, Center for Computation & Technology; Floating Point Systems Professor, Departments of Physics and Computer Science, Louisiana State University

Visualization of Katrina developed at CCT
"Application Frameworks for High Performance and Grid Computing", G. Allen, E. Seidel, 2006. http://www.cct.lsu.edu/~gallen/Preprints/CS_Allen06b.pre.pdf

Page 33:

Outline

Introduce the ASC

Cactus: Architecture

Cactus: Math

Cactus: Scaling Out

Gridsphere

Agnostic Questions

Page 34:

GridSphere

1st Grid Middleware Congress

www.GridSphere.org

Page 35:

GridSphere (Agnostic 3)

Developed by the EU GridLab project
About 100,000 lines of code
Version 2.0

Framework based on: Grid Portal Development Kit (GPDK), ASC Web Portal

Open source project: http://www.gridsphere.org

Framework for portlet development
Portlet Container

Page 36:

Ongoing Collaborations (Agnostic 10)

Cactus portal at the Albert Einstein Institute: interface to the Cactus numerical relativity application; provides physicists with an interface for launching jobs & viewing results

Grid Portal at the Canadian National Research Council: provides controlled remote access to NMR spectroscopy instruments

GEON earth sciences portal / CHRONOS portal: manage/visualize/analyze vast amounts of geosciences data and large-scale databases

P-GRADE portal at SZTAKI (Hungary) & U. Westminster (UK): creation, execution and monitoring of complex workflows

Page 37:

GridSphere Core Portlets

Login/Logout
Role Based Access Control (RBAC) separating users into guests, users, admins, and super users
Account Request
Account Management
User Management
Portlet Subscription
Local File Manager
Notepad
Text Messaging

Page 38:

Action Portlets

Hides branching logic
Action and view methods to invoke for events
Provides a default actionPerformed

Page 39:

Personalized Environment

"GridSphere’s Grid Portlets: A Grid Portal Development Framework", Jason Novotny, GridSphere and Portlets workshop, March 2005, e-Science Institute

Page 40:

Single Sign-On Capabilities

"GridSphere’s Grid Portlets: A Grid Portal Development Framework", Jason Novotny, GridSphere and Portlets workshop, March 2005, e-Science Institute

Page 41:

Perform File Transfers (Agnostic 2)

"GridSphere’s Grid Portlets: A Grid Portal Development Framework", Jason Novotny, GridSphere and Portlets workshop, March 2005, e-Science Institute

Page 42:

GridSphere

"GridSphere’s Grid Portlets: A Grid Portal Development Framework", Jason Novotny, GridSphere and Portlets workshop, March 2005, e-Science Institute

Page 43:

Submit Jobs

"GridSphere’s Grid Portlets: A Grid Portal Development Framework", Jason Novotny, GridSphere and Portlets workshop, March 2005, e-Science Institute

Page 44:

GridSphere

https://portal.cct.lsu.edu/gridsphere/gridsphere?cid=home

Page 45:

Outline

Introduce the ASC

Cactus: Architecture

Cactus: Math

Cactus: Scaling Out

Gridsphere

Agnostic Questions

Page 46:

Agnostic Questions

6. The ASC application server uses a relational database to maintain the state of sessions; could it be implemented in any other way? Explain.

Sure; SQL is a de-facto standard. The GridSphere container provides a Persistence Manager that uses the open-source Castor libraries from Exolab, which provide mechanisms for mapping objects to SQL and an object query language (OQL).

Using Castor, mappings from Java to SQL can be generated automatically.

Page 47:

Agnostic Questions

7. Can you expand on the MDS browser developed by the Java CoG Kit?

Currently MDS4 uses XPath instead of LDAP as the query language.
Registration is performed via a Web service.
The container has a built-in MDS-Index service.
Aggregator services are dynamic.
A service can become a Grid-wide service index.

Source: http://globus.org/toolkit/docs/4.0/key/GT4_Primer_0.6.pdf

Page 48:

Agnostic Questions

9. Could the collaboratory framework be implemented using technologies other than Java? If so, could it still be used in the same way? What would be the pros and cons of using Java technologies vs. other alternative technologies?

Cactus uses C and Fortran, but Web portals are mainly being developed using Java:

GridPort, GCE-RG, GPDK, GridSphere

Page 49:

Questions?

Fallacy? “Discipline scientists are typically not experts in distributed computing.”

Cactus developers have a background in Mathematics and Physics.