The University of Sunderland Cluster Computer IET Lecture by John Tindle Northumbria Network, ICT Group Monday 11 February 2008
Page 1

The University of Sunderland Cluster Computer

IET Lecture by John Tindle

Northumbria Network, ICT Group

Monday 11 February 2008

Page 2

Overview of talk

- SRIF3 and potential vendors
- General requirements
- Areas of application
- Development team
- Cluster design
- Cluster system hardware + software
- Demonstrations

Page 3

United Kingdom – Science Research Investment Fund (SRIF)

The Science Research Investment Fund (SRIF) is a joint initiative by the Office of Science and Technology (OST) and the Department for Education and Skills (DfES). The purpose of SRIF is to contribute to higher education institutions' (HEIs) long-term sustainable research strategies and address past under-investment in research infrastructure.

Page 4

SRIF3

- Funding: SRIF3 90%, UoS 10%
- Project duration: about two years
- Made operational by late December 2007
- Heriot-Watt University - coordinator

Page 5

Potential Grid Computer Vendors

- Dell - selected vendor
- CompuSys - SE England
- Streamline - Midlands
- Fujitsu - Manchester
- ClusterVision - Dutch
- OCF - Sheffield

Page 6

General requirements

Page 7

General requirements

High performance general purpose computer:
- Built using standard components
- Commodity off the shelf (COTS) low cost PC technology
- Reuse existing skills - Ethernet
- Easy to maintain - hopefully

Page 8

Designed for Networking Experiments

Require flexible networking infrastructure:
- Modifiable under program control
- Managed switch required (an unmanaged switch is often employed in standard cluster systems)
- Fully connected programmable intranet

Page 9

System Supports

- Rate limiting
- Quality of service (QoS)
- Multiprotocol Label Switching (MPLS)
- VLANs and VPNs
- IPv4 and IPv6 supported in hardware
- Programmable queue structures

Page 10

Special requirements 1
- Operation at normal room temperature
- Typical existing systems require a low air inlet temperature (< 5 degrees C) and a dedicated server room with air conditioning
- Low acoustic noise output
- Dual boot capability: Windows or Linux in any proportion

Page 11

Special requirements 2 continued
- Concurrent processing, for example 75% of cores in boxes for Windows, 25% for Linux
- CPU power control - 4 levels
- High resolution displays for media and data visualisation

Page 12

Advantages of design
- Heat generated is not vented to the outside atmosphere
- Air conditioning running costs are not incurred
- Heat is used to heat the building
- Compute nodes (height 2U) use relatively large diameter, low noise fans

Page 13

Areas of application

Page 14

Areas of application

1. Media systems - 3D rendering
2. Networking experiments (MSc Network Systems - large cohort)
3. Engineering computing
4. Numerical optimisation
5. Video streaming
6. IP television

Page 15

Application cont 1

7. Parallel distributed computing
8. Distributed databases
9. Remote teaching experiments
10. Semantic web
11. Search large image databases
12. Search engine development
13. Web based data analysis

Page 16

Application cont 2

14. Computational fluid dynamics
15. Large scale data visualisation using high resolution colour computer graphics

Page 17

UoS Cluster Development Team (from left to right):
Kevin Ginty, Simon Stobart, John Tindle, Phil Irving, Matt Hinds
Note - all wearing Dell tee shirts

Page 18

UoS Team

Page 19

UoS Cluster

Page 20

Work Area
At last, all up and running!

Page 21

UoS Estates Department

Very good project work was completed by the UoS Estates Department:
- Electrical network design
- Building air flow analysis
- Computing Terraces
- Heat dissipation - finite element (FE) study and analysis
- Work area refurbishment

Page 22

Cluster Hardware

Page 23

Cluster Hardware

The system has been built using Dell compute nodes and Cisco networking components, with grid design contributions from both Dell and Cisco.

Page 24

Basic Building Block

Compute nodes: Dell PE2950 server
- Height 2U
- Two dual core processors, four cores per box
- RAM 8G, 2G per core
http://157.228.27.155/website/CLUSTER-GRID/Dell-docs1/

Page 25

Compute Nodes

- Network interface cards: 3 off
- Local disk drives: 250G SATA II
- The large amount of RAM facilitates virtual computing experiments (VMware Server and MS Virtual PC)

Page 26

Cisco 6509 switch

- Cisco 6509 (1 off)
- Cisco 720 supervisor engines (2 off)
- Central network switch for the cluster
- RSM router switch module

Page 27

6509 Provides

- 720 Gbps full duplex (4 off port cards)
- Virtual LANs - VLAN
- Virtual private networks - VPN
- Link bandwidth throttling
- Traffic prioritisation, QoS
- Network experimentation

Page 28

Cluster Intranet

The network has three buses:
1. Data
2. IPC
3. IPMI

Page 29

1. Data bus

User data bus - a normal data bus required for interprocessor communication between user applications.

Page 30

2. IPC Bus
Inter process communication (IPC): "The Microsoft Windows operating system provides mechanisms for facilitating communications and data sharing between applications. Collectively, the activities enabled by these mechanisms are called interprocess communications (IPC). Some forms of IPC facilitate the division of labor among several specialized processes."

Page 31

IPC Bus continued
"Other forms of IPC facilitate the division of labor among computers on a network." (Ref: Microsoft website)
- IPC is controlled by the OS
- For example, IPC is used to transfer and install new disk images on compute nodes
- Disk imaging is a complex operation
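As a concrete (if simplified) illustration of IPC, the sketch below passes a message between two endpoints over a loopback socket, one generic mechanism of the kind the quoted text alludes to. It is an assumption-laden toy, not how Scali or the cluster's imaging traffic actually works; the message text is invented.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

// Minimal IPC illustration: a "server" endpoint and a "client" endpoint exchange
// one message over the loopback interface. Generic example only; the cluster's
// actual imaging transfers are managed by the OS and its tooling.
public class IpcDemo {
    public static String roundTrip(String msg) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) {   // OS-assigned free port
            Thread t = new Thread(() -> {
                try (Socket s = server.accept();
                     BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()));
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                    out.println("ack: " + in.readLine()); // echo with acknowledgement
                } catch (Exception ignored) { }
            });
            t.start();
            try (Socket c = new Socket("127.0.0.1", server.getLocalPort());
                 PrintWriter out = new PrintWriter(c.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(new InputStreamReader(c.getInputStream()))) {
                out.println(msg);
                String reply = in.readLine();
                t.join();
                return reply;
            }
        }
    }
    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("install image 42")); // prints "ack: install image 42"
    }
}
```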

Page 32

3. IPMI Bus
The Intelligent Platform Management Interface (IPMI) specification defines a set of common interfaces to computer hardware and firmware which system administrators can use to monitor system health and manage the system.

Page 33

Master Rack A
- Linux and Microsoft
- 2 x PE2950 control nodes
- 5 x PE1950 web servers
- Cisco Catalyst 6509 with 2 x 720 supervisor engines
- 4 x 48 port cards (192 ports)

Page 34

Master Rack A cont

- Compute nodes require 40 x 3 = 120 connections
- Disk storage: 1 x MD1000, http://157.228.27.155/website/CLUSTER-GRID/Dell-docs1/
- Master rack resilient to mains failure
- Power supply: 6 kVA APC (hard wired 24 Amp PSU)

Page 35

Master Rack A KVM Switch

Ethernet KVM switch (Keyboard, Video display, Mouse)
- Provides user access to the head nodes
- Windows head node, named "Paddy"
- Linux head node, named "Max"
- Movie: USCC MVI_6991.AVI

Page 36

Rack B Infiniband
InfiniBand is a switched fabric communications link primarily used in high-performance computing. Its features include quality of service and failover, and it is designed to be scalable. The InfiniBand architecture specification defines a connection between processor nodes and high performance I/O nodes.

Page 37

Infiniband Rack B

- 6 x PE2950, each with two HCAs
- 1 x Cisco 7000P router
- Host channel adapter (HCA) link: http://157.228.27.155/website/CLUSTER-GRID/Cisco-docs1/HCA/
- InfiniBand: http://en.wikipedia.org/wiki/InfiniBand

Page 38

Cisco Infiniband

- Cisco 7000P
- High speed bus: 10 Gbit/s
- Low latency: < 1 microsecond
- InfiniBand: 6 compute nodes, 24 CPU cores
- High speed serial communication

Page 39

Infiniband

- Many parallel channels
- PCI Express bus (serial DMA)
- Direct memory access (DMA)
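The latency and bandwidth figures quoted above determine when the InfiniBand overlay pays off. A standard first-order model puts transfer time at latency + size/bandwidth; the sketch below applies it using the slide's InfiniBand figures and an assumed GigE latency of about 50 microseconds (typical for the era, not quoted in the talk), so the outputs are illustrative rather than measured.

```java
// First-order message-time model: time = latency + bytes / bandwidth.
// InfiniBand figures (10 Gbit/s, < 1 us latency) are from the slides;
// the GigE latency of ~50 us is an assumption for comparison only.
public class LinkModel {
    public static double seconds(long bytes, double latencySec, double bitsPerSec) {
        return latencySec + (bytes * 8.0) / bitsPerSec;
    }
    public static void main(String[] args) {
        long oneMiB = 1L << 20;
        double ib = seconds(oneMiB, 1e-6, 10e9);  // InfiniBand
        double ge = seconds(oneMiB, 50e-6, 1e9);  // Gigabit Ethernet (assumed latency)
        System.out.printf("1 MiB over InfiniBand: %.3f ms%n", ib * 1e3);
        System.out.printf("1 MiB over GigE:       %.3f ms%n", ge * 1e3);
    }
}
```

For small messages the latency term dominates, which is why the sub-microsecond figure matters for tightly coupled parallel jobs.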

Page 40

General compute Rack C

11 x PE2950 compute nodes
Product details

Page 41

Racks

- A x 1: 2 control nodes (+ 5 servers), GigE
- B x 1: 6 InfiniBand (overlay)
- C x 3: 11 each (33), GigE
- N x 1: 1 (Cisco Netlab + VoIP)
Total compute nodes: 2 + 6 + 33 + 1 = 42

Page 42

Rack Layout

Rack order: C C B A C N, with F marking space for future expansion
KVM video - MVI_6994.AVI

Page 43

Summary - Dell Server 2950

- Number of nodes: 40 + 1 (Linux) + 1 (Windows)
- Number of compute nodes: 40
- Intel Xeon Woodcrest 2.66 GHz, two dual core processors
- GigE NICs: 3 off per server
- RAM 8G, 2G per core
- Disks 250G SATA II

Page 44

Summary - cluster speedup

Compare the time taken to complete a task:
- Time on cluster = 1 hour
- Time using a single CPU = 160 hours, or 160/24 = 6.6 days (approximately 1 week)
- Facility available for use by companies
- "Software City" startup companies
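The speedup figures above can be checked with a short calculation. This sketch uses the slide's numbers plus one assumption not stated there: a core count of 160 (40 compute nodes x 4 cores), which would make the quoted 160x gain correspond to near-perfect parallel efficiency.

```java
// Sketch: speedup and parallel efficiency for the figures quoted on the slide.
// The 160-core count (40 nodes x 4 cores) is an assumption for illustration.
public class SpeedupDemo {
    // speedup = serial time / parallel time
    public static double speedup(double serialHours, double parallelHours) {
        return serialHours / parallelHours;
    }
    // efficiency = speedup / number of cores (1.0 means perfect scaling)
    public static double efficiency(double serialHours, double parallelHours, int cores) {
        return speedup(serialHours, parallelHours) / cores;
    }
    public static void main(String[] args) {
        System.out.println("Speedup: " + speedup(160.0, 1.0) + "x");
        System.out.println("Serial time in days: " + (160.0 / 24.0));
        System.out.println("Efficiency on 160 cores: " + efficiency(160.0, 1.0, 160));
    }
}
```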

Page 45

Data storage

- Master nodes via PERC5e to MD1000 using 15 x 500G SATA drives
- Disk storage: 7.5T
- Linux: 7 disks; MS 2003 Server HPC: 8 disks
- MD1000 URL: http://157.228.27.155/website/CLUSTER-GRID/Dell-docs2/

Page 46

Power

Total maximum load generated by the Dell cluster cabinets:
- Total load = 20,742 W (about 20.7 kW)
- Values determined using Dell's integrated system design tool (Power and Noise)
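A quick sanity check on the quoted total, assuming the figure is in watts (about 20.7 kW; read literally as 20,742 kW it would be a power-station-scale load): spread across the 42 nodes it comes to roughly 500 W per box, plausible for a 2U dual-socket server of this generation.

```java
// Sanity check: the cluster power figure read as 20,742 W spread over 42 nodes.
public class PowerCheck {
    public static double perNodeWatts(double totalWatts, int nodes) {
        return totalWatts / nodes;
    }
    public static void main(String[] args) {
        System.out.printf("Per-node load: %.1f W%n", perNodeWatts(20742.0, 42));
    }
}
```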

Page 47

Web servers

- PE1950, height 1U
- Five servers
- Web services
- Domain controller, DNS, DHCP etc
- http://157.228.27.155/website/CLUSTER-GRID/Dell-docs1/

Page 48

Access Workstations
- Dell workstations (10 off), operating system WinXP Pro
- HD displays: LCD (4 off), size 32 inch, wall mounted
- Graphics processor units: NVS285 (8 off, 2 GPUs each), NVS440 (2 off, 4 GPUs each)
- Support for HDTV

Page 49

Block Diagram

Key elements of the USCC block diagram:
- Campus network linked to the University of Sunderland Cluster Computer (USCC)
- Central switch: Cisco 6509, 2 x 720 supervisor engines, 720 Gb duplex aggregate bandwidth, 40 Gb per slot
- 4 x 48 port line cards = 192 ports, 1 Gb Ethernet links (copper)
- Hardware support for VPNs, QoS, MPLS, rate limiting, private VLANs, IPv4 and IPv6 routing
- GigE intranets: 3 LANs (data, control/monitor, spare)
- 42 nodes: 2 head nodes (Linux/Windows) and 40 compute nodes; 2 CPUs/node, 2 cores/CPU = 4 cores/node; RAM 8 Gb; 250 Gb SATA per node
- Distributed storage: 40 x 250 Gb = 10 Tb
- NAS 7.5 Tb: 8 x 500 Gb Linux, 7 x 500 Gb Windows
- InfiniBand overlay: 6 nodes, 2 HCAs per node, 10 Gbps links, InfiniBand switch 7000P
- Cluster gateway / web servers: 5 x 1950 (3 x Win2003 Server, 2 x Linux), online support for users
- Local access workstations: 10 x Dell PCs; data visualisation display area with 4 x 37 inch LCD flat screens

Page 50

Movie USCC

MVI_6992.AVI

Page 51

Cluster Software

Page 52

Cluster Software

Compute node operating systems:
- Scientific Linux (based on Red Hat)
- MS Windows Server 2003, high performance computing (HPC)

Page 53

Scali
Scali Management - software to manage high performance cluster computers. Scali is used to control the cluster: start and stop processes, upload data/code, and schedule tasks.
Scali datasheet: http://www.scali.com/ and http://157.228.27.155/website/CLUSTER-GRID/Scali/

Page 54

Other software

- Apache web services
- Tomcat, Java server side programming
- Compilers: C++, Java
- Servers: FTPD
- 3D modelling and animation: Blender, Autodesk 3DS Max

Page 55

Virtual Computing

Page 56

Virtual Network Security Experiment - example

Virtual network: VMware appliances. Components:
(1) NAT router
(2) WinXP-SP2 attacks FC5 across the network
(3) Network hub - interconnection
(4) Firewall - protection
(5) Fedora Core FC5 target system

Page 57

Network Security Experiment

Key labels from the topology diagram:
- VMware host containing all virtual components
- XPProSP2 (Eth0) on the NAT network (VMnet8)
- Firewall with Red (eth1) and Green (eth0) Ethernet interfaces; forwards port 80 from Red to FC5's IP
- HUB (VMnet4) connecting to FC5 (Eth0)
- FC5 loads the Apache (httpd) web server

Page 58

Security Experiment
- A total of 5 virtual networking devices using just one compute box
- Port scanning attack (Nessus)
- Intrusion detection (Snort)
- Tunnelling using SSH and PuTTY
- RAM required: 500K+ for each network component

Page 59

Cisco Netlab

Cisco Netlab provides remote access to network facilities for experimental purposes. Netlab is installed in the Network cabinet, plus a VoIP demonstration system for teaching purposes.

Page 60

Network Research

Page 61

Current Research - Network Planning

Network Planning Research:
- Network model using OOD
- Hybrid parallel search algorithm based upon features of a parallel genetic algorithm (GA) and particle swarm optimisation (PSO)
- Ring of communicating processes
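The hybrid search described above combines GA and PSO ideas over a ring of communicating processes. The toy sketch below only illustrates that ring-neighbour exchange pattern on a made-up one-dimensional objective; the objective function, update coefficients, and ring size are all invented for the illustration and bear no relation to the research code.

```java
import java.util.Random;

// Toy sketch of a ring of communicating search processes: each "process" holds a
// candidate solution, pulls it PSO-style toward its own best and its ring
// neighbours' bests, with a small GA-style random mutation. Illustrative only.
public class RingSearch {
    // Toy objective to minimise, with its minimum at x = 3.
    public static double f(double x) { return (x - 3.0) * (x - 3.0); }

    public static double[] run(int ringSize, int iters, long seed) {
        Random rnd = new Random(seed);
        double[] x = new double[ringSize], best = new double[ringSize];
        for (int i = 0; i < ringSize; i++) { x[i] = rnd.nextDouble() * 10; best[i] = x[i]; }
        for (int it = 0; it < iters; it++) {
            for (int i = 0; i < ringSize; i++) {
                // Ring topology: only the left and right neighbours are consulted.
                int left = (i + ringSize - 1) % ringSize, right = (i + 1) % ringSize;
                double nbrBest = f(best[left]) < f(best[right]) ? best[left] : best[right];
                // PSO-style pull toward personal and neighbour bests, plus mutation noise.
                x[i] += 0.5 * rnd.nextDouble() * (best[i] - x[i])
                      + 0.5 * rnd.nextDouble() * (nbrBest - x[i])
                      + 0.1 * rnd.nextGaussian();
                if (f(x[i]) < f(best[i])) best[i] = x[i];
            }
        }
        return best; // per-process personal bests
    }

    public static void main(String[] args) {
        double[] best = run(8, 200, 42L);
        double overall = best[0];
        for (double b : best) if (f(b) < f(overall)) overall = b;
        System.out.println("Best x found: " + overall);
    }
}
```

The point of the ring is that good solutions spread gradually, one neighbour per iteration, which keeps communication local and cheap on a cluster.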

Page 62

Network Planning Research
- Web services; server side programs - JSP
- FTP daemon, URL objects, XML
- Pan-Reif solver based on Newton's method
- Steve Turner, PhD student; submits May 2008 - first to use the UoS Cluster Computer (USCC)

Page 63

Hybrid GA

Page 64

Telecom Network PlanningDSL for ISP

Page 65

DSL Network Plan Schematic Diagram

Page 66

Numerical output from GA optimiser – PON Equipment

Page 67

Data visualisation - multidimensional data structure: location, time and service types

Page 68

Demonstrations

1. IPTV
2. Java test program

Page 69

Demonstration 1 - IPTV

- IP television demonstration (IP - internet protocol)
- Video LAN client - VLC
- Number of servers and clients: 10
- Video streams: standard definition, 4 to 5 Mbps
- Multicasting, Class D addressing
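Multicast delivery with Class D addressing, as used in the VLC demonstration, can be sketched with Java's `MulticastSocket`: joining a group triggers an IGMP membership report, after which the host receives the stream's datagrams. The group address and port below are illustrative assumptions; the demo's actual values are not given in the talk.

```java
import java.net.DatagramPacket;
import java.net.InetAddress;
import java.net.MulticastSocket;

// Sketch: joining a multicast group to receive a video stream.
// Group address and port are assumptions, not the demo's real settings.
public class IptvReceiver {
    // Class D (multicast) addresses span 224.0.0.0 - 239.255.255.255.
    public static boolean isClassD(String addr) throws Exception {
        return InetAddress.getByName(addr).isMulticastAddress();
    }
    public static void main(String[] args) throws Exception {
        String group = "239.1.1.1"; // assumed group address in the Class D range
        int port = 5004;            // assumed port
        try (MulticastSocket sock = new MulticastSocket(port)) {
            sock.joinGroup(InetAddress.getByName(group)); // sends an IGMP membership report
            byte[] buf = new byte[1500];                  // one Ethernet-sized datagram
            DatagramPacket pkt = new DatagramPacket(buf, buf.length);
            sock.receive(pkt);                            // blocks until a stream packet arrives
            System.out.println("Received " + pkt.getLength() + " bytes from " + pkt.getAddress());
        }
    }
}
```

The appeal of multicast for IPTV is that the sender transmits each stream once, regardless of how many of the ten clients have joined the group.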

Page 70

IPTV IGMP
- Internet group management protocol
- Video streams: HD 16 Mbps (HD only uses 1.6% of 1 Gbps)
- Clips: Rudolf Nureyev dancing; Six Five Special (1957) with Don Lang and the Frantic Five; new dance demonstration - the Bunny Hop

Page 71

Demonstration 2

Java demonstration test program:
- Compute node processes: 40
- Workstation server: 1
- Communication via UDP
- Graphical display on the local server of data sent from compute nodes
- Network configuration: star
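The star arrangement above can be sketched as each compute-node process sending a reading to the hub server over UDP. The hub address comes from the ipadd.txt description later in the talk; the port and the "nodeId:value" message format are assumptions, since the demo's actual wire protocol is not shown.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

// Sketch: one compute-node process reporting a value to the hub server over UDP
// (star topology: every node sends to the same hub). Port and message format assumed.
public class StarNode {
    // Format a report as "nodeId:value" (an invented format for illustration).
    public static String report(int nodeId, double value) {
        return nodeId + ":" + value;
    }
    public static void main(String[] args) throws Exception {
        InetAddress hub = InetAddress.getByName("192.168.1.50"); // hub address from ipadd.txt
        int hubPort = 9000;                                       // assumed port
        byte[] payload = report(1, 42.0).getBytes(StandardCharsets.UTF_8);
        try (DatagramSocket sock = new DatagramSocket()) {
            sock.send(new DatagramPacket(payload, payload.length, hub, hubPort));
        }
        System.out.println("Sent: " + new String(payload, StandardCharsets.UTF_8));
    }
}
```

UDP suits this display demo: a lost reading just leaves one bar briefly stale, so no retransmission machinery is needed.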

Page 72

Star network

Cluster demonstration program: nodes 1, 2, ..., 39, 40 each connect to a central server node. Configurations shown: star and ring.

Page 73

Cluster configuration file

Description of file ipadd.txt:
1 - node id
192.168.1.50 - hub server address
192.168.1.5 - previous compute node
192.168.1.7 - next compute node
192.168.1.51 - Hub2 (spare)

Equation:
double val = 100 * (0.5 + Math.exp(-t/tau) * 0.5 * Math.sin(theta));
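The ipadd.txt layout and the displayed equation can be combined into a small sketch. The field order follows the slide's description; the tau and theta values are arbitrary assumptions chosen to show the behaviour (a decaying oscillation settling toward 50).

```java
// Sketch: the ipadd.txt fields as the slide describes them, plus the slide's
// equation evaluated at two sample points. tau and theta values are assumed.
public class ClusterConfigDemo {
    // The equation from the slide, verbatim.
    public static double val(double t, double tau, double theta) {
        return 100 * (0.5 + Math.exp(-t / tau) * 0.5 * Math.sin(theta));
    }
    public static void main(String[] args) {
        // Lines in the order the slide gives for ipadd.txt.
        String[] lines = {
            "1",            // node id
            "192.168.1.50", // hub server address
            "192.168.1.5",  // previous compute node
            "192.168.1.7",  // next compute node
            "192.168.1.51"  // Hub2 (spare)
        };
        int nodeId = Integer.parseInt(lines[0]);
        System.out.println("Node " + nodeId + " reports to hub " + lines[1]);
        // As t grows the decaying sine term vanishes and val approaches 50.
        System.out.println("val(t=0)   = " + val(0, 1.0, Math.PI / 2));
        System.out.println("val(t=100) = " + val(100, 1.0, Math.PI / 2));
    }
}
```

The previous/next node fields are what let the same file drive the ring configuration as well as the star.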

Page 74

Screenshot of hub server bar graph display

Page 75

USCC configuration
Single demo in a compute node:
- Directories: 1 + 4 = 5 (top level + one per core)
- All compute nodes: 40 x 5 = 200
- Workstations: 10, so 10 x 200 = 2000
- Ten demos: 10 x 2000 = 20,000 directories to set up
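The directory totals above are a chain of multiplications, which is exactly why a configuration program (shown on the next slide) is needed rather than manual setup. This sketch reproduces the arithmetic, reading the 2000-directory step as the slide's 10 workstations times the 200 per-cluster directories.

```java
// Sketch: the directory bookkeeping for USCC demo setup, following the slide's totals.
public class DemoDirCount {
    // Top-level directory plus one per core.
    public static int dirsPerNode(int cores) { return 1 + cores; }
    public static void main(String[] args) {
        int perNode = dirsPerNode(4);         // 5 per compute node
        int allNodes = 40 * perNode;          // 200 across all compute nodes
        int withWorkstations = 10 * allNodes; // 2000 for the 10 workstations
        int tenDemos = 10 * withWorkstations; // 20,000 directories for ten demos
        System.out.println(perNode + ", " + allNodes + ", " + withWorkstations + ", " + tenDemos);
    }
}
```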

Page 76

Java program to configure cluster

Page 77

UoS Cluster Computer Inaugural Event

Page 78

UoS Cluster Computer Inaugural Event

Date: Thursday 24 April 2008
Time: 5.30pm
Venue: St Peter's Campus
Three speakers (each 20 minutes):
- John MacIntyre - UoS
- Robert Starmer - Cisco, San Jose
- TBA - Dell Computers

Page 79

USCC Inaugural Event

Attendance is free. Anyone wishing to attend is asked to register beforehand to facilitate catering.
Contact via email: [email protected]

Page 80

The End

Thank you for your attention. Any questions?

Slides and further information available at: http://157.228.27.155/website/CLUSTER-GRID/