A Wide Range of Scientific Disciplines Will Require a Common Infrastructure
• Example--Two e-Science Grand Challenges
– NSF’s EarthScope--USArray
– NIH’s Biomedical Informatics Research Network
• Common Needs
– Large Number of Sensors / Instruments
– Daily Generation of Large Data Sets
– Data on Multiple Length and Time Scales
– Automatic Archiving in Distributed Federated Repositories (sketched below)
– Large Community of End Users
– Multi-Megapixel and Immersive Visualization
– Collaborative Analysis From Multiple Sites
– Complex Simulations Needed to Interpret Data
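The "Automatic Archiving in Distributed Federated Repositories" need above can be made concrete with a minimal sketch. Nothing here is specified in the slides: the site directories, file names, and the archive_daily helper are all hypothetical, and a plain file copy plus a SHA-256 checksum stands in for a real federated replication protocol.

```python
import hashlib
import json
import shutil
import tempfile
from datetime import date
from pathlib import Path

def archive_daily(dataset: Path, sites: list[Path]) -> dict:
    """Copy one day's data set to every federated site and record a manifest.

    Hypothetical sketch: a real repository federation would use a replication
    service over the network, not local file copies.
    """
    digest = hashlib.sha256(dataset.read_bytes()).hexdigest()
    manifest = {"file": dataset.name, "date": date.today().isoformat(),
                "sha256": digest, "replicas": []}
    for site in sites:
        site.mkdir(parents=True, exist_ok=True)
        replica = site / dataset.name
        shutil.copy2(dataset, replica)
        # Verify each replica against the original checksum before trusting it.
        assert hashlib.sha256(replica.read_bytes()).hexdigest() == digest
        manifest["replicas"].append(str(replica))
    # One manifest per data set lets any site audit the federation later.
    (sites[0] / f"{dataset.name}.manifest.json").write_text(
        json.dumps(manifest, indent=2))
    return manifest

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        root = Path(tmp)
        data = root / "sensors-day001.dat"  # stand-in for a daily instrument dump
        data.write_bytes(b"example sensor readings")
        print(archive_daily(data, [root / "site_a", root / "site_b"]))
```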
NSF’s EarthScope--USArray
• Resolution of Crust & Upper Mantle Structure to Tens of Kilometers
• Transportable Array
A LambdaGrid Will Be the Backbone for an e-Science Network
• Metro Area Laboratories Springing Up Worldwide
• Developing GigE and 10GigE Applications and Services
• Testing Optical Switches
• Metro Optical Testbeds--the Next GigaPOP?
Campus Laboratory LambdaGrid “On-Ramps” are Needed to Link to MetroGrid
• TND2 = Data-Mining Clusters at NU and the UIC Lab for Advanced Computing
– 32 Deerfield processors, each with 10GigE networking; NetRam storage
• TNV2 = Visualization Clusters at NU and UIC EVL
– 27 Deerfield processors, each with 10GigE networking; 25 screens
• TNC2 = TeraGrid Computing Clusters at EVL
– 32 Deerfield processors, each with 10GigE networking
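A quick back-of-the-envelope check shows why these clusters need optical metro links: each node injects 10 Gb/s, so every cluster's aggregate traffic exceeds the diagrammed uplinks. The node counts come from the list above; the 10x10GigE and 2x40GigE capacities are read off the diagram below, and the oversubscription comparison itself is our illustration, not a figure from the slides.

```python
# Aggregate injection bandwidth per cluster vs. the diagrammed uplinks.
# Node counts are from the slide; the uplink figures (10x10GigE toward the
# optical switch, 2x40GigE DWDM toward StarLight) are read off the diagram.
clusters_nodes = {"TND2": 32, "TNV2": 27, "TNC2": 32}
NIC_GBPS = 10
UPLINKS_GBPS = {"10x10GigE": 10 * 10, "2x40GigE DWDM": 2 * 40}

for name, nodes in clusters_nodes.items():
    aggregate = nodes * NIC_GBPS
    print(f"{name}: {nodes} nodes x {NIC_GBPS} Gb/s = {aggregate} Gb/s injected")
    for link, capacity in UPLINKS_GBPS.items():
        print(f"  oversubscription on {link}: {aggregate / capacity:.1f}:1")
```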
[Network diagram: the TND2, TNV2, and TNC2 clusters at UIC (EVL and the Lab for Advanced Computing) connect through routers and 10x10GigE links to an O-O-O optical switch, which reaches StarLight/Northwestern over 2x40GigE DWDM spans; TND2 and TNV2 clusters, a router, and a second O-O-O switch sit at the StarLight/Northwestern end.]
Source: Tom DeFanti, EVL, UIC
Research Topics for Building an e-Science LambdaGrid
• Provide Integrated Services in the Tbit/s Range
– Lambda-Centric Communication & Computing Resource Allocation (see the wavelength-assignment sketch after this list)
– Middleware Services for Real-Time Distributed Programs
– Extend Internet QoS Provisioning Over a WDM-Based Network
• Develop a Common Control-Plane Optical Transport Architecture:
– Transport Traffic Over Multiple User Planes With Variable Switching
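The slides do not say how lambdas would be allocated. A standard baseline in the WDM literature is first-fit wavelength assignment under the wavelength-continuity constraint, sketched below; the topology, link names, and the first_fit_lambda function are all illustrative assumptions rather than anything from the deck.

```python
# First-fit wavelength assignment: pick the lowest-index lambda that is free
# on every link of the path (wavelength-continuity constraint, i.e. no
# wavelength conversion at the O-O-O switches). Illustrative sketch only.

def first_fit_lambda(path, used_lambdas, num_lambdas):
    """Return the first wavelength index free on all links of `path`, or None."""
    for lam in range(num_lambdas):
        if all(lam not in used_lambdas[link] for link in path):
            return lam
    return None  # path blocked: no continuous wavelength available

if __name__ == "__main__":
    # Hypothetical metro topology: campus -> metro PoP -> StarLight.
    used = {
        ("EVL", "MetroPoP"): {0, 1},       # lambdas already lit on this span
        ("MetroPoP", "StarLight"): {1, 2},
    }
    path = [("EVL", "MetroPoP"), ("MetroPoP", "StarLight")]
    lam = first_fit_lambda(path, used, num_lambdas=32)
    print(f"assign lambda {lam}")  # lambda 3: 0, 1, 2 are busy somewhere en route
    if lam is not None:
        for link in path:
            used[link].add(lam)    # commit the allocation on every span
```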
Research Topics for Building an e-Science LambdaGrid (continued)
• Enhance Security Mechanisms:
– End-to-End Integrity Check of Data Streams (sketched after this list)
– Access Multiple Locations With Trusted Authentication Mechanisms
– Use Grid Middleware for Authentication, Authorization, Validation, Encryption, and Forensic Analysis of Multiple Systems and Administrative Domains
• Distribute Storage While Optimizing Storewidth:
– Distribute Massive Pools of Physical RAM (Network Memory) (see the second sketch below)
– Develop Visual TeraMining Techniques to Mine Petabytes of Data
– Enable Ultrafast Image Rendering
– Create Optical Storage Area Networks (OSANs):
  – Analysis and Modeling Tools
  – OSAN Control and Data Management Protocols
  – Buffering Strategies and Memory Hierarchies for WDM Optical Networks
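To make "End-to-End Integrity Check of Data Streams" concrete, here is a minimal sketch of a streaming check: both endpoints fold the stream into a running hash chunk by chunk and compare digests at the end. The slides name no algorithm; SHA-256, the chunking, and the simulated stream are all our illustrative choices, and a real deployment would tie the check into the Grid middleware mentioned above.

```python
import hashlib

def sha256_stream(chunks):
    """Fold an iterable of byte chunks into one SHA-256 digest."""
    h = hashlib.sha256()
    for chunk in chunks:
        h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    # Simulated data stream; in practice this would be a lambda-delivered flow.
    payload = bytes(range(256)) * 4096                 # ~1 MiB of sample data
    chunks = [payload[i:i + 65536] for i in range(0, len(payload), 65536)]

    sent_digest = sha256_stream(chunks)      # computed at the sending site
    received_digest = sha256_stream(chunks)  # recomputed at the receiving site
    # End-to-end check: digests must match, or the stream was corrupted.
    assert received_digest == sent_digest
    print("stream verified:", received_digest[:16], "...")
```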
Participants: UCSD, UCI, USC, UIC, & NW
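Similarly, the "Distribute Massive Pools of Physical RAM (Network Memory)" item above can be illustrated with a toy model that stripes a payload across several in-process dictionaries standing in for remote RAM pools. Everything here (the NetworkMemory class, the three-node pool, the stripe size) is a hypothetical sketch; a real system would place stripes on remote machines reached over the optical network.

```python
class NetworkMemory:
    """Toy network-memory pool: stripe byte payloads across node RAM pools.

    Each dict in `nodes` stands in for the physical RAM of one remote machine;
    a real implementation would ship stripes over the network instead.
    """

    def __init__(self, num_nodes=3, stripe_size=4):
        self.nodes = [dict() for _ in range(num_nodes)]
        self.stripe_size = stripe_size
        self.lengths = {}  # bytes stored per key, needed to reassemble

    def put(self, key, payload: bytes):
        self.lengths[key] = len(payload)
        for offset in range(0, len(payload), self.stripe_size):
            stripe_no = offset // self.stripe_size
            node = self.nodes[stripe_no % len(self.nodes)]  # round-robin placement
            node[(key, stripe_no)] = payload[offset:offset + self.stripe_size]

    def get(self, key) -> bytes:
        total = self.lengths[key]
        stripes = (total + self.stripe_size - 1) // self.stripe_size
        return b"".join(
            self.nodes[n % len(self.nodes)][(key, n)] for n in range(stripes)
        )

if __name__ == "__main__":
    mem = NetworkMemory()
    mem.put("frame-0", b"multi-megapixel image tile")
    assert mem.get("frame-0") == b"multi-megapixel image tile"
    print([len(node) for node in mem.nodes], "stripes per node")
```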
A Layered Software Architecture is Needed for Defense and Civilian Applications