UK Status & Planning for LHC(b) Computing
Andrew Halley, CERN
LHCb Software Week, 26th November 1999

Outline
• Changes to sources of funding for computing in the UK.
• Past and present computing resources.
• Future plans for computing developments.
26/11/99 LHCb Software Week at CERN, Andrew Halley (CERN)
Funding Sources for PP Computing in the UK
Until recently (the last two years), UK funding for particle physics computing had two components:
• Direct funding to the individual University Groups.
• Central funding to the IT Group of CCLRC at Rutherford Appleton Laboratory.

Outcomes:
• Installed equipment at the Universities is small in scale, but well tailored.
• There is a large facility at RAL, but it needs the experiments to motivate changes.

Enter the new concept from the UK Government: get the individual experiments and/or University Groups to bid for (big) money.
New external sources of computing funding
Joint Research Equipment Initiative (JREI)
The aim of JREI is to contribute to the physical research infrastructure and to enable high-quality research to be undertaken, particularly in areas of basic and strategic priority for science and technology, such as those identified by Foresight. £99M in 1999, the 4th round.

Joint Infrastructure Fund (JIF)
£700M over three years. The money will enable universities to finance essential building, refurbishment and equipment projects to ensure that they remain at the forefront of international scientific research.
Personal summary: PPARC computing JREI bids
Year  Submitted  Decided  Experiment  Awarded?  Amount (£K)  Details?
1998  May-98     Jan-99   BaBar       Yes        800         Yes
1998  May-98     Jan-99   LHCb        Yes        275         Yes
1999  May-99     -        LHCb        Pending   1260.8       Some
1999  May-99     -        ALICE       Pending    145.8       Not yet
1999  May-99     -        D0          Pending    371.5       Some
1999  May-99     -        CDF         Pending   1355.5       Some
The above represents a summary of the information available from various sources, including PPARC.
Personal summary: PPARC computing JIF bids
The following represents a summary of the information available from various sources, excluding PPARC.
Year  Submitted  Decided    Experiment  Awarded?  Amount (£K)  Details
1998  Dec-98     Postponed  CDF         Pending   2340         Yes
1999  Apr-99     -          BaBar       Pending      ?         Some
1999  May-99     -          UKQCD       Pending   6660         Some
1999  May-99     -          GlasgowQCD  Pending   2800         Some
BaBar JIF 99 - submitted April 99
A line to SLAC computers for analysis, a big SMP at RAL, smaller SMPs at each site, and Linux farm(s) for simulation.

LHCb JREI 99 - submitted May 99
40 PCs with 1 Tb of disk each, to store and analyse the data generated by MAP.
Timescales and deadlines for bid procedures
The most recent round had to be submitted by May 1999, with decisions not expected before January 2000.
The 1999 bids had to be submitted by Spring 1999, with decisions expected around November 1999.
The next round's submission date is 11th October 1999, with the "decision point" expected to be March 2000.
Current Computing Resources in the UK
Considering the central facilities currently available:
• A large central datastore, combining a large disk pool with backing store and tape robots.
• Central CPU farms and servers, currently comprising:
  - the CSF facility, based on Hewlett-Packard processors,
  - a Windows NT facility, based on Pentium processors,
  - an upgraded Linux-CSF facility, based on Pentium processors.

In addition, the home Universities have considerable power in workstation clusters and dedicated farms, often "harvested" by software "robots" which serve out tasks remotely.
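The "harvesting" described above amounts to a pull model: idle workstations repeatedly ask a central robot for the next task until none remain. A minimal sketch of the idea (hypothetical Python; the actual 1999-era robots were site-specific scripts, and all names here are invented):

```python
import queue
import threading

class TaskRobot:
    """Hypothetical task server: idle workstations 'pull' the next job."""

    def __init__(self, tasks):
        self.tasks = queue.Queue()
        for t in tasks:
            self.tasks.put(t)
        self.results = []
        self.lock = threading.Lock()

    def serve(self, worker_name):
        # Each worker repeatedly asks for work until the queue is drained.
        while True:
            try:
                task = self.tasks.get_nowait()
            except queue.Empty:
                return
            result = f"{worker_name} ran {task}"
            with self.lock:
                self.results.append(result)

robot = TaskRobot([f"mc_job_{i}" for i in range(100)])
workers = [threading.Thread(target=robot.serve, args=(f"node{n}",)) for n in range(4)]
for w in workers:
    w.start()
for w in workers:
    w.join()
print(len(robot.results))  # 100: every task served exactly once
```

The pull model is what makes idle-cycle harvesting robust: a slow or crashed workstation simply stops asking for work, and the remaining nodes drain the queue.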
Current usage statistics of the RAL datastore
Typically ~10-15 Tb is accessible from the datastore, but only ~5 Tb is actively used at any given time.
Usage of the HP CSF facility at RAL
As an example snapshot of the use of the service: from April '99 to September '99, average use was ~80%.
Linux CSF farm and its usage at RAL

The Linux farm now consists of:
• Forty Pentium II 450 MHz CPUs, each with 256 Mb memory, 10 Gb of fast local disk, and 100 Mb/s fast ethernet.

[Chart: Linux CPU utilisation by week, 13/09/99 to 4/10/99, in Pentium 450 hours, shown against the farm's maximum capacity.]
[Pie chart: Linux Farm Utilisation Snapshot, 13/09/99-11/10/99. Shares by experiment: alephuk 41%, theory 23%, h1 21%, zeus 13%, system 1%, atlas 1%, cdf 0%, minos 0%.]
The farm is currently well used by active experiments, and has excellent potential for upgrades.
Windows NT Farm and usage at RAL
Ten dual-processor machines with 450 MHz CPUs have been added to the farm. The upgrade increases the capacity of the farm by a factor of ~5.

The service is used heavily by both ALEPH and LHCb for MC production, and will be used as part of LHCb plans to generate large numbers (10^6) of inclusive bbar events in the near future.
• Automatic job submission software has been set up for LHCb.
• System software replication has been set up, so it is now very easy to extend the system as appropriate.
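At its core, automatic submission of an MC production like the one above means generating one batch script per run and handing each to the farm's queue. A hypothetical minimal sketch in Python (the real LHCb/NT setup used site-specific tools; the `sicbmc` command, paths, and all names here are invented for illustration):

```python
import os
import tempfile

def make_job_script(run_number, n_events, outdir):
    """Generate a hypothetical batch script for one MC production run."""
    return "\n".join([
        f"REM LHCb MC production run {run_number}",
        f"sicbmc -events {n_events} -run {run_number}",   # hypothetical generator command
        f"copy output_{run_number}.dst {outdir}",
    ])

def submit_all(first_run, n_jobs, events_per_job, outdir):
    """Write one script per run; a real system would pass each to the batch queue."""
    scripts = []
    for i in range(n_jobs):
        run = first_run + i
        path = os.path.join(tempfile.mkdtemp(), f"job_{run}.bat")
        with open(path, "w") as f:
            f.write(make_job_script(run, events_per_job, outdir))
        scripts.append(path)
    return scripts

# 10^6 events split into 100 jobs of 10,000 events each:
jobs = submit_all(first_run=1, n_jobs=100, events_per_job=10_000, outdir=r"\\datastore\lhcb")
print(len(jobs))  # 100
```

Splitting one large sample into many small jobs is what lets the replicated system scale: adding nodes just means the same queue drains faster.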
New computing resources outside of RAL
On the basis of the new funding arrangements in the UK, the University of Liverpool was given funds to build MAP, a large MC processor based on cut-down Linux nodes:
• 300 processors: 400 MHz PII, 128 Mbytes memory, 3 Gbytes disk.
• D-Link 100BaseT ethernet + hubs.
• Commercial units, BUT custom boxes for packing and cooling.
• The nodes are rack mounted and run a stripped-down version of RedHat Linux 5.2.

MAP is tailored for production using 1 Tb of locally mounted disk, but needs a corresponding solution for analysing the data locally.
Computing resources outside of RAL: MAP
[Diagram ("the idea") and photograph ("in reality"): a Master node on the external ethernet serves the MAP slaves through 100BaseT hubs.]

The system is scalable: it can be increased by adding more slaves and/or network hubs. It also benefits from the bulk purchase of uniform hardware.
Future plans for CPU upgrades etc.
The intention is to develop the Linux farm at RAL:
• Order 30 new dual-processor 600 MHz nodes to be added to the existing cluster.
• Add more hardware around April/May next financial year to keep up with demand.

There are also plans to augment MAP at Liverpool with subsystems at additional LHCb UK sites, and to:
• Develop COMPASS, a model for LHC analyses.
• Use a fast Linux server to check large disk-pool read/write speeds of 50/20 Mb/s with over 1 Tb of data space attached.
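A read/write speed check of this kind boils down to timing a large sequential write and read against the disk pool. A simplified stand-in for whatever benchmark was actually used (hypothetical Python; the file size is kept small so the sketch runs anywhere):

```python
import os
import tempfile
import time

def measure_io(path, size_mb=64, block_kb=256):
    """Time a sequential write then read; return (write, read) rates in MB/s."""
    block = b"\0" * (block_kb * 1024)
    n_blocks = size_mb * 1024 // block_kb

    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(n_blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # force data to disk, not just the OS buffer
    write_rate = size_mb / (time.perf_counter() - start)

    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(block_kb * 1024):
            pass
    read_rate = size_mb / (time.perf_counter() - start)
    # NB: the read may be served from the OS cache; a real test would
    # use a file larger than the server's RAM.
    return write_rate, read_rate

with tempfile.NamedTemporaryFile(delete=False) as tmp:
    path = tmp.name
try:
    w, r = measure_io(path)
    print(f"write {w:.0f} MB/s, read {r:.0f} MB/s")
finally:
    os.remove(path)
```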
Future “plans” for LHC computing in the UK

Given the new funding arrangements in the UK, and the challenges facing us with the LHC computing needs:
[Diagram: hierarchy from CERN, through Tier-1 Regional Centres and Tier-2 Regional Centres, down to the Institutes.]
The UK plans to operate a Tier-1 Regional Centre based at RAL, with several Tier-2 centres (such as MAP/COMPASS) at the Universities.

An LHC-wide UK JIF bid will be submitted for capital funding for the years through to LHC start-up.
Ramping up the UK resources for the LHC
The resources needed depend somewhat on the computing models adopted by the experiments, but are currently:
• An additional tape robot will be purchased in 2003, allowing datastore extensions to 320 Tb.
• Network bandwidth to CERN is assumed to be 50 Mb/s, with similar performance to the Tier-2 centres achieved in 2003, increasing thereafter to 500 Mb/s.
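To put those bandwidth figures in perspective, they translate directly into transfer times for datastore-sized samples. A quick back-of-the-envelope check (treating the quoted rates as megabits per second, which is an assumption, and ignoring protocol overhead):

```python
def transfer_time_hours(data_tb, link_mbps):
    """Hours to move data_tb terabytes over an ideal link of link_mbps megabits/s."""
    bits = data_tb * 1e12 * 8           # terabytes -> bits
    seconds = bits / (link_mbps * 1e6)  # ideal throughput, no overhead
    return seconds / 3600

# Moving 1 Tb over the assumed links:
print(round(transfer_time_hours(1, 50), 1))   # 44.4 hours at 50 Mb/s
print(round(transfer_time_hours(1, 500), 1))  # 4.4 hours at 500 Mb/s
```

So at the initially assumed 50 Mb/s, shipping even 1 Tb to or from CERN takes nearly two days, which is why the ramp to 500 Mb/s matters for any analysis model that moves data between tiers.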
Tentative conclusions and summary

Clearly, the field is evolving quickly. The status can be broken down into:
• Upgraded Linux (NT?) farms, roughly doubling capacity every year or so.
• Increases in datastore size.
• New massive simulation facilities like MAP coming online.
• Analysis engines being developed to cope with the generated data rates.
• Development of Tier-1 and Tier-2 data centres, with two orders of magnitude increases in stored data and CPU power.
• Two to three orders of magnitude of improvement in network access bandwidth.