Australian Partnership for Advanced Computing
“providing advanced computing and grid infrastructure for eResearch”
Rhys Francis
Manager, APAC grid program

Partners:
• Australian Centre for Advanced Computing and Communications (ac3) in NSW
• The Australian National University (ANU)
• Commonwealth Scientific and Industrial Research Organisation (CSIRO)
• Interactive Virtual Environments Centre (iVEC) in WA
• Queensland Parallel Supercomputing Foundation (QPSF)
• South Australian Partnership for Advanced Computing (SAPAC)
• The University of Tasmania (TPAC)
• Victorian Partnership for Advanced Computing (VPAC)
APAC Programs

• National Facility Program
– a world-class advanced computing service
– currently 232 projects and 659 users (27 universities)
– major upgrade in capability (1650-processor Altix 3700 system)

• APAC Grid Program
– integrate the National Facility and Partner Facilities
– allow users easier access to the facilities
– provide an infrastructure for Australian eResearch

• Education, Outreach and Training Program
– increase skills to use advanced computing and grid systems
– courseware project
– outreach activities
– national and international activities
[Diagram: APAC Grid program structure]
• APAC Grid Development and APAC Grid Operation, each with a Project Leader, cover the program's Research Activities, Development Activities and Projects
• Oversight: Steering Committee, Research Leader, Engineering Taskforce and Implementation Taskforce
• Scale: 140 people, >50 full-time equivalents, $8M pa in people, plus compute/data resources
Grid Infrastructure

Computing Infrastructure
• Globus middleware
• certificate authority
• system monitoring and management (grid operation centre)

Information Infrastructure
• Storage Resource Broker (SRB)
• metadata management support
• portals to application software
• workflow engines
• visualisation tools
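The workflow engines listed above run multi-step computations in which each step depends on the outputs of earlier ones. A minimal sketch of that idea, as a dependency-ordered task runner (the task names and dependency graph are illustrative, not part of APAC's middleware, which would submit each step to a grid resource rather than call a local function):

```python
# Minimal workflow-engine sketch: execute tasks in dependency order,
# detecting cycles. Task names here are made up for illustration.

def run_workflow(tasks, deps):
    """tasks: dict name -> callable; deps: dict name -> list of prerequisite names."""
    done, order = set(), []

    def visit(name, seen=()):
        if name in done:
            return
        if name in seen:
            raise ValueError(f"dependency cycle at {name!r}")
        for d in deps.get(name, []):
            visit(d, seen + (name,))
        tasks[name]()          # a real engine would submit this to the grid
        done.add(name)
        order.append(name)

    for name in tasks:
        visit(name)
    return order

results = []
tasks = {
    "fetch": lambda: results.append("fetch"),
    "simulate": lambda: results.append("simulate"),
    "visualise": lambda: results.append("visualise"),
}
deps = {"simulate": ["fetch"], "visualise": ["simulate"]}
order = run_workflow(tasks, deps)
print(order)  # ['fetch', 'simulate', 'visualise']
```

The same dependency-ordering logic underlies production engines; they add remote job submission, failure recovery and data staging on top of it.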
Grid Applications
Astronomy
High-Energy Physics
Bioinformatics
Computational Chemistry
Geosciences
Earth Systems Science
Organisation Chart
Program Manager: Rhys Francis
Project | Leader | S/C Chair
Gravity Wave (Astronomy) | Susan Scott | Rachael Webster
Astrophysics portal | Matthew Bailes | Rachael Webster
Australian Virtual Observatory | Katherine Manson | Rachael Webster
Genome annotation | Matthew Bellgard | Mark Ragan
Molecular docking | Rajkumar Buyya | Mark Ragan
Chemistry workflow | Andrey Bliznyuk | Brian Yates
Earth Systems Science workflow | Glenn Hyland | Andy Pitman
Geosciences workflow | Robert Woodcock | Scott McTaggart
EarthBytes | Dietmar Muller | Scott McTaggart
Experimental high energy physics | Glenn Moloney | Tony Williams
Theoretical high energy physics | Paul Coddington | Tony Williams
Remote instrument management | Chris Willing | Bernard Pailthorpe
Project | Leader | Services | Gateway VM
Compute Infrastructure | David Bannon | CA, VOMS/VOMRS, Gram2/4 | NG1, NG2
Information Infrastructure | Ben Evans | SRB, GridFTP, MDS2/4 | NGdata
UI&VI | Rajesh Chabbra | Gridsphere, Myproxy | NGportal
Collaboration Services | Chris Willing | A/G |
APAC Executive Director: John O’Callaghan
Name | Partner
Youzhen Cheng | ac3
David Baldwyn | ANU
Bob Smart | CSIRO
Darran Carey | iVEC
Martin Nicholls | QPSF/UQ
Grant Ward | SAPAC
John Dalton | TPAC
Chris Samuel | VPAC
Associated grid nodes:
David Green | QPSF/Griffith
Ian Atkinson | QPSF/JCU
Ashley Wright | QPSF/QUT
Marco La Rosa | UoM
Gateway Servers Support Team: David Bannon
Services Architect: Markus Buchhorn
LCG VM: Marco La Rosa
Infrastructure Support (Middleware)
Application Support
Infrastructure Support (Systems)
Strategic Management
Middleware Deployment
Research Applications
Systems Management
Experimental High Energy Physics
• Belle Physics Collaboration
– KEK B-factory detector
– all components and materials: 10 x 10 x 20 m, 100 µm accuracy
– tracking and energy deposition through all components
– all electronics effects (signal shapes, thresholds, noise, cross-talk)
– data acquisition system (DAQ)
• Need 3 times as many simulated events as real events to reduce statistical fluctuations
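The 3x figure can be motivated with simple Poisson statistics: when a measured yield is corrected by a Monte Carlo efficiency, both the data sample and the simulated sample contribute to the relative statistical variance, as 1/N_data and 1/N_mc respectively. A worked example (the event counts below are illustrative, not Belle's):

```python
import math

# With N_mc = k * N_data, the combined relative uncertainty is
# sqrt(1/N_data + 1/(k*N_data)); at k = 3 the simulation inflates
# the data-only uncertainty by only sqrt(4/3) ~ 15%.

def relative_uncertainty(n_data, n_mc):
    """Combined relative statistical uncertainty for Poisson counts."""
    return math.sqrt(1.0 / n_data + 1.0 / n_mc)

n_data = 100_000                      # illustrative number of real events
only_data = math.sqrt(1.0 / n_data)   # irreducible data-only limit
with_3x_mc = relative_uncertainty(n_data, 3 * n_data)

print(f"{only_data:.5f}")   # 0.00316
print(f"{with_3x_mc:.5f}")  # 0.00365  (~15% above the data-only limit)
```

Pushing k much beyond 3 buys little, since the data term dominates, which is why roughly three simulated events per real event is a common working ratio.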
Belle status
• Apparatus at KEK in Japan
• Simulation work done worldwide
• Shared using an SRB federation: KEK, ANU, VPAC, Korea, Taiwan, Krakow, Beijing… (led by Australia!)
• Previous research work used script-based workflow control; the project is currently evaluating LCG middleware for workflow management
• Testing in progress: LCG job management, APAC grid job execution (2 sites), APAC grid SRB data management (2 sites) with data flow using international SRB federations
• The main limitation is international networking
Earth Systems Science Workflow
• Access to Data Products
– Intergovernmental Panel on Climate Change scenarios of future climate (3TB)
– Ocean Colour Products of Australasian and Antarctic region (10TB)
– 1/8 degree ocean simulations (4TB)
– Weather research products (4TB)
– Earth Systems Simulations
– Terrestrial Land Surface Data
• Grid Services
– Globus-based version of OPeNDAP (UCAR/NCAR/URI)
– Server-side analysis tools for data sets: GrADS, NOMADS
– Client-side visualisation from on-line servers
– THREDDS (catalogues of OPeNDAP repositories)
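OPeNDAP makes the server-side analysis above possible by letting a client request only a slice of a remote dataset: a DAP2 constraint expression appended to the dataset URL names the variable and the index ranges wanted, so just that subset crosses the network. A sketch of building such a URL (the hostname, dataset path and variable name are hypothetical):

```python
# Build a DAP2 constraint-expression URL asking an OPeNDAP server for a
# hyperslab of one variable. Each dimension gets a [start:stride:stop]
# index range. Server, dataset and variable names below are made up.

def dap_subset_url(base, var, slices):
    """base: dataset URL; slices: list of (start, stride, stop) per dimension."""
    ce = var + "".join(f"[{a}:{b}:{c}]" for a, b, c in slices)
    return f"{base}.dods?{ce}"

url = dap_subset_url(
    "http://opendap.example.org/data/sst_monthly.nc",
    "sst",
    [(0, 1, 11), (100, 1, 199), (200, 1, 299)],  # time, lat, lon index ranges
)
print(url)
# http://opendap.example.org/data/sst_monthly.nc.dods?sst[0:1:11][100:1:199][200:1:299]
```

THREDDS catalogues then give clients a way to discover which datasets and variables such URLs can be formed against.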
VDT components:
• Condor configuration script
• DOE and LCG CA Certificates, v4 (includes LCG 0.25 CAs)
• DRM, v1.2.9
• EDG CRL Update, v1.2.5
• EDG Make Gridmap, v2.1.0
• Fault Tolerant Shell (ftsh), v2.0.12
• Generic Information Provider, v1.2 (2004-05-18)
• gLite CE Monitor, v1.0.2
• Globus Toolkit, pre web-services, v4.0.1 + patches
• Globus Toolkit, web-services, v4.0.1
• GLUE Schema, v1.2 draft 7
• Grid User Management System (GUMS), v1.1.0
• GSI-Enabled OpenSSH, v3.5
• Java SDK, v1.4.2_08
• jClarens, v0.6.0
• jClarens Web Service Registry, v0.6.0
• JobMon, v0.2
• KX509, v20031111
• MonALISA, v1.2.46
• MyProxy, v2.2
• MySQL, v4.0.25
• Nest, v0.9.7-pre1
• NetLogger, v3.2.4
• PPDG Cert Scripts, v1.6
• PRIMA Authorization Module, v0.3
• PyGlobus, vgt4.0.1-1.13
• RLS, v3.0.041021
• SRM Tester, v1.0
• UberFTP, v1.15
• Virtual Data System, v1.4.1
• VOMS, v1.6.7
• VOMS Admin (client 1.0.7, interface 1.0.2, server 1.1.2), v1.1.0-r0
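A pinned manifest like this one is useful operationally: a gateway can be audited by comparing its deployed component versions against the release list. A minimal sketch of such a check, using a few entries from the list above (the "deployed" versions are invented for the example):

```python
# Sketch: audit a deployed grid stack against a release manifest.
# Manifest entries are taken from the VDT list above; the deployed
# versions below are hypothetical.

MANIFEST = {
    "MyProxy": "2.2",
    "MySQL": "4.0.25",
    "VOMS": "1.6.7",
    "RLS": "3.0.041021",
}

def find_mismatches(deployed):
    """Return {component: (expected, found)} for every out-of-sync entry."""
    return {
        name: (want, deployed.get(name))
        for name, want in MANIFEST.items()
        if deployed.get(name) != want
    }

deployed = {"MyProxy": "2.2", "MySQL": "4.0.24", "VOMS": "1.6.7", "RLS": "3.0.041021"}
print(find_mismatches(deployed))  # {'MySQL': ('4.0.25', '4.0.24')}
```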
Our most important design decision
[Diagram: two grid sites, each with a Gateway Server fronting its clusters and datastore, connected by a dedicated V-LAN]
Installing Gateway Servers at all grid sites, using VM technology to support multiple grid stacks.

Gateways will support GT2, GT4, LCG/EGEE, data grid (SRB etc), production portals, development portals and experimental grid stacks.

High-bandwidth, dedicated private networking between grid sites.
Gateway Systems
• Support the basic operation of the APAC National Grid and translate grid protocols into site-specific actions
– limit the number of systems that need grid components installed and managed
– enhance security, as many grid protocols and associated ports only need to be open between the gateways
– in many cases only the local gateways need to interact with site systems
– support roll-out and control of production grid configuration
– support production and development grids and local experimentation using Virtual Machine implementation
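The "translate grid protocols into site-specific actions" role can be sketched in a few lines: the gateway accepts a site-neutral job description and rewrites it as the local batch system's submit command, so only the gateway needs grid ports open. The site names, batch systems and command forms below are illustrative, not APAC's actual configuration:

```python
# Gateway sketch: map a site-neutral job description to each site's local
# batch submission command. Which site runs which batch system is invented
# here for illustration.

SITE_BATCH = {
    "site_a": lambda job: ["qsub", "-l", f"nodes={job['nodes']}", job["script"]],  # a PBS site
    "site_b": lambda job: ["sbatch", f"--nodes={job['nodes']}", job["script"]],    # a Slurm site
}

def translate(site, job):
    """Turn a site-neutral job dict into that site's local submit command."""
    if site not in SITE_BATCH:
        raise ValueError(f"no gateway mapping for site {site!r}")
    return SITE_BATCH[site](job)

cmd = translate("site_a", {"nodes": 4, "script": "belle_sim.sh"})
print(cmd)  # ['qsub', '-l', 'nodes=4', 'belle_sim.sh']
```

Because the mapping lives only on the gateway, adding a new grid stack or changing a site's batch system touches one host, which is the point of the gateway design above.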