Page 1:

Clusters, Grids and their applications in Physics

David Barnes (Astro)

Lyle Winton (EPP)

Page 2:

Today’s typical workstation can:
• Compute 1024-point fast Fourier transforms at a rate of 40000 per second.
• Compute and apply gravitational force for 32768 particles at a rate of ~one time step per sec.
• Render ~7M elements of a data volume per sec.
• Stream data to and from disk at around 30 MByte per sec.
• Communicate with other machines at up to 10 MByte per sec, with a latency of a few tens of milliseconds.
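As a rough sanity check, the first figure can be converted into a sustained floating-point rate. A minimal sketch in Python, assuming the textbook ~5*N*log2(N) flop count for a complex FFT (a rule of thumb, not a number from the slide):

```python
import math

# Floating-point rate implied by "40000 1024-point FFTs per second",
# assuming ~5*N*log2(N) flops per complex FFT (a rule of thumb, not a
# figure from the slide).
N = 1024
ffts_per_second = 40_000
flops_per_fft = 5 * N * math.log2(N)            # ~51,200 flops
rate = ffts_per_second * flops_per_fft
print(f"~{rate / 1e9:.1f} Gflop/s sustained")   # ~2.0 Gflop/s
```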

… … what if this is not enough? … …

Page 3:

High Performance Computers

• ~20 years ago: 1x10^6 Floating Point Ops/sec (Mflop/s) [scalar processors]

• ~10 years ago: 1x10^9 Floating Point Ops/sec (Gflop/s) [vector processors]

• Today: superscalar-based CLUSTERS, 1x10^12 Floating Point Ops/sec (Tflop/s)
  – highly parallel, distributed, networked superscalar processors

• ~Less than 10 years away: 1x10^15 Floating Point Ops/sec (Pflop/s)
  – multi-level clusters and GRIDS

(Slide illustration: scalar, a+b=c; vector, A+B=C; parallel, a+b=c and d+e=f at the same time)
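A minimal Python sketch of the same contrast the illustration is making: scalar hardware adds one pair of numbers at a time, vector hardware operates on whole arrays, and a parallel machine splits independent pieces of the work across processors (the array size and the two-way split are arbitrary choices for illustration):

```python
import numpy as np

a = np.arange(100_000, dtype=np.float64)
b = np.ones_like(a)

# Scalar style: one addition at a time (a+b=c, element by element).
c_scalar = np.empty_like(a)
for i in range(len(a)):
    c_scalar[i] = a[i] + b[i]

# Vector style: one operation over whole arrays (A+B=C).
c_vector = a + b

# Parallel style: independent chunks that could run on different
# processors (a+b=c on one node, d+e=f on another).
c_parallel = np.empty_like(a)
for chunk in np.array_split(np.arange(len(a)), 2):
    c_parallel[chunk] = a[chunk] + b[chunk]

assert np.allclose(c_scalar, c_vector) and np.allclose(c_vector, c_parallel)
```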

Page 4:

High-Performance Computing Directions: Beowulf-class PC Clusters

Definition:
• Common off-the-shelf PC nodes: Pentium, Alpha, PowerPC, SMP
• COTS LAN/SAN interconnect: Ethernet, Myrinet, Giganet, ATM
• Open Source Unix: Linux, BSD
• Message passing computing: MPI, PVM, HPF

Advantages:
• Best price-performance
• Low entry-level cost
• Just-in-place configuration
• Vendor invulnerable
• Scalable
• Rapid technology tracking

Enabled by PC hardware, networks and operating systems achieving the capabilities of scientific workstations at a fraction of the cost, and by the availability of industry-standard message passing libraries.

Slide from Jack Dongarra… let’s see some clusters …
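Message passing is what ties the off-the-shelf nodes above into a single machine. A minimal sketch of the style of program such a cluster runs, written here with the mpi4py binding and a toy sum as the workload (both are choices made for this sketch; the slide names MPI itself, not any particular binding or example):

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()    # this process's id within the cluster job
size = comm.Get_size()    # total number of processes across the nodes

# Each process sums its own share of 0..999, then the partial sums
# are combined on rank 0 with a single message-passing reduction.
local = sum(i for i in range(1000) if i % size == rank)
total = comm.reduce(local, op=MPI.SUM, root=0)

if rank == 0:
    print(f"{size} processes computed sum = {total}")   # 499500
```

On a Beowulf cluster this would typically be launched across the nodes with something like `mpirun -np 24 python sum_demo.py`, with MPI carrying the inter-process messages over the Ethernet or Myrinet interconnect.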

Page 5:

Mojo: School of Physics cluster
24 nodes, 24 CPUs, 2 & 2.4 GHz Pentium 4

~70 Gflop/s

Page 6:

Swinburne Centre for Astrophysics and Supercomputing:
90 nodes, 180 CPUs, 2.0, 2.2 & 2.4 GHz Pentium 4
16 nodes, 32 CPUs, 933 MHz Pentium III

~500 Gflop/s
• Accomplish > 8 million 1024-pt FFTs per second.
• Render > 10^9 volume elements per second.
• Calculate one time step per second for ~150000 particles using a brute-force approach.
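The last bullet is roughly consistent with the quoted ~500 Gflop/s: a brute-force O(N^2) force calculation for ~150000 particles is about 2x10^10 pair interactions per step, and at a few tens of flops per interaction (an assumed figure, not from the slide) that is close to one second of work. A sketch of the arithmetic:

```python
# Rough consistency check of the brute-force N-body bullet above.
# flops_per_pair is an assumed rule of thumb, not a slide figure.
n_particles = 150_000
flops_per_pair = 25
flops_per_step = flops_per_pair * n_particles ** 2   # O(N^2) pairs

machine_rate = 500e9                                 # ~500 Gflop/s
print(f"~{flops_per_step / machine_rate:.1f} s per time step")   # ~1.1 s
```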

Page 7:

Next-generation “blade” cluster
IBM Blue Gene
~200 Tflop/s

Page 8:

SETI@home
• Uses thousands of Internet-connected PCs to help in the search for extraterrestrial intelligence.
• Uses data collected with the Arecibo Radio Telescope in Puerto Rico.
• When your computer is idle, the software downloads a 300 kilobyte chunk of data for analysis.
• The results of this analysis are sent back to the SETI team and combined with results from thousands of other participants.
• Largest distributed computation project in existence:
  – ~400,000 machines
  – Averaging 26 Tflop/s

Slide from Jack Dongarra

This is more than a cluster … perhaps it is the first genuine “Grid” …

http://setiathome.berkeley.edu
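Unlike a message-passing cluster, SETI@home needs no communication between workers: a server hands out independent work units and merges the returned results. A minimal sketch of that master/worker pattern, using a local process pool to stand in for the volunteer PCs (the analysis of a ~300 kB chunk is stubbed out; this is not the actual SETI@home code):

```python
from multiprocessing import Pool

def analyse(work_unit):
    # Stand-in for analysing one chunk of telescope data: returns the
    # chunk id and a fake "signal score" for it.
    return work_unit, sum((work_unit * i) % 7 for i in range(1000))

if __name__ == "__main__":
    work_units = range(100)              # chunks queued on the server
    with Pool(processes=8) as pool:      # stand-in for volunteer PCs
        results = pool.map(analyse, work_units)
    # The "server" side: combine everything the workers sent back.
    best = max(results, key=lambda item: item[1])
    print(f"strongest candidate came from work unit {best[0]}")
```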

Page 9:

over to Lyle Winton…