U.S. ATLAS Software
WBS 2.2
S. Rajagopalan
July 8, 2003
DOE/NSF Review of LHC Computing
Outline
Organizational Issues: ATLAS & U.S. ATLAS software
Current Affairs: current resource allocation, including LCG contributions; major milestones met
FY04 Planning: planning and coordination with international ATLAS; near-term milestones; priorities and requests for FY04
Conclusions
Organizational Issues
New Computing Organization
[Organization chart]
Computing Management Board
Coordinates and manages computing activities; sets priorities and takes executive decisions.
Membership: Computing Coordinator (chair); Software Project Leader (D. Quarrie, LBNL); TDAQ Liaison; Physics Coordinator; International Computing Board Chair; Grid, Data Challenge and Operations Coordinator; Planning & Resources Coordinator (T. LeCompte, ANL); Data Management Coordinator (D. Malon, ANL).
Meets bi-weekly.
Software Project Management Board
Coordinates the coherent development of software (core, applications and software support).
Membership: Software Project Leader (chair, D. Quarrie); Simulation Coordinator; Event Selection, Reconstruction & Analysis Tools Coordinator; Core Services Coordinator (D. Quarrie); Software Infrastructure Team Coordinator; LCG Applications Liaison (T. Wenaus, BNL); Physics Liaison; TDAQ Liaison; sub-system coordinators for the Inner Detector, Liquid Argon, Tile and Muon systems.
Liquid Argon: S. Rajagopalan (BNL); Muon: S. Goldfarb (U. Michigan).
Meets bi-weekly.
US ATLAS Software Organization
Software Project (WBS 2.2): S. Rajagopalan
Coordination (WBS 2.2.1)
Core Services (WBS 2.2.2): D. Quarrie
Data Management (WBS 2.2.3): D. Malon
Application Software (WBS 2.2.4): F. Luehring
Software Support (WBS 2.2.5): A. Undrus

US ATLAS software WBS scrubbed, consistent with ATLAS.
Resource loading and reporting established at Level 4.
Major change compared to the previous WBS: Production and Grid Tools & Services moved under Facilities.
Current Affairs
WBS 2.2.1 Coordination
David Quarrie (LBNL): ATLAS Software Project Manager; ATLAS Chief Architect; U.S. ATLAS Core Services Level 3 Manager
David Malon (ANL): ATLAS Data Management Coordinator; U.S. ATLAS Data Management Level 3 Manager
Other U.S. ATLAS personnel playing leading roles in ATLAS: S. Goldfarb (Muon), T. LeCompte (Planning), S. Rajagopalan (LAr), T. Wenaus (LCG Liaison)
WBS 2.2.2 Core Services (D. Quarrie)
P. Calafiura (LBNL): framework support, event merging, EDM infrastructure
M. Marino (LBNL): SEAL plug-in and component support
W. Lavrijsen (LBNL): user interfaces, Python scripting, bindings to the dictionary, integration with GANGA
C. Leggett (LBNL): conditions infrastructure, G4 service integration in Athena, histogramming support; redirected to other tasks in FY04
H. Ma, S. Rajagopalan (BNL, base program): EDM infrastructure
C. Tull (LBNL, PPDG): Athena grid integration coordination
WBS 2.2.2 Key Accomplishments
Python-based user interfaces to CMT, Athena, and ROOT
Interval of Validity Service to allow time-based retrieval of conditions data into transient memory
Support for the plug-in manager in LCG/SEAL
gcc 3.2 support, multithreading support, pile-up support
Services to upload persistent addresses for on-demand retrieval of data objects
Common material definition across sub-systems; creation of G4 geometries from this description demonstrated
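To illustrate the idea behind the Interval of Validity Service, here is a minimal sketch of interval-keyed lookup of conditions data. All names (IOVStore, add, retrieve) are hypothetical stand-ins for this sketch, not the actual Athena IOVSvc interface.

```python
# Illustrative sketch of interval-of-validity (IOV) lookup: each conditions
# payload is valid over a half-open [start, stop) time range, and retrieval
# by timestamp returns the payload whose interval covers that time.
import bisect

class IOVStore:
    """Holds conditions payloads, each valid over a disjoint time interval."""

    def __init__(self):
        self._starts = []   # sorted interval start times
        self._entries = []  # (stop_time, payload), parallel to _starts

    def add(self, start, stop, payload):
        # Insert while keeping start times sorted (intervals assumed disjoint).
        i = bisect.bisect_left(self._starts, start)
        self._starts.insert(i, start)
        self._entries.insert(i, (stop, payload))

    def retrieve(self, time):
        """Return the payload valid at `time`, or None if no interval covers it."""
        i = bisect.bisect_right(self._starts, time) - 1
        if i < 0:
            return None
        stop, payload = self._entries[i]
        return payload if time < stop else None

store = IOVStore()
store.add(0, 100, {"hv": 2000.0})    # calibration valid for times [0, 100)
store.add(100, 200, {"hv": 1900.0})  # updated calibration from time 100
print(store.retrieve(42))    # falls in the first interval
print(store.retrieve(250))   # outside any interval -> None
```

The bisect-based lookup keeps retrieval logarithmic in the number of intervals, which matters when conditions data spans many runs.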
WBS 2.2.3 Data Management(D. Malon)
S. Vanyachine (ANL) :S. Vanyachine (ANL) :
Database Services & Servers, NOVA database
Kristo Karr (ANL) :Kristo Karr (ANL) :
New Hire, replaces S. Eckmann
Collections, Catalogs and Metadata
Valeri Fine (BNL) :Valeri Fine (BNL) :
Integration of Pool with Athena
David Adams (BNL) :David Adams (BNL) :
Event datasets
Victor Perevotchikov (BNL) :Victor Perevotchikov (BNL) :
POOL evaluation, foreign object persistent in ROOT.
WBS 2.2.3 Key Accomplishments
ATLAS-specific Athena-POOL conversion service prototype; will be available to end users in July (tied to the POOL release)
Support for the NOVA database (primary source of detector description for simulation): interval-of-validity support, NOVA automatic object generation, data additions, embedded MySQL support for G4, authentication, access to databases behind firewalls
LCG contributions: delivered the POOL collections/metadata WP interface with documentation and unit tests; delivered a relational implementation of POOL explicit collections; delivered MySQL and related package support; foreign object persistence
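The transient/persistent conversion pattern behind an Athena-POOL conversion service can be sketched as follows: a converter pair is registered per type, and the service invokes it on write and read. This is an illustrative sketch only; ConversionService, register, write and read are hypothetical names, not the real Athena/POOL API, and JSON stands in for the actual persistent technology.

```python
# Sketch of a type-keyed transient/persistent conversion service.
import json

class ConversionService:
    def __init__(self):
        self._converters = {}  # type name -> (to_persistent, from_persistent)

    def register(self, cls, to_persistent, from_persistent):
        self._converters[cls.__name__] = (to_persistent, from_persistent)

    def write(self, obj):
        """Serialize a transient object to a persistent (here: JSON) record."""
        to_p, _ = self._converters[type(obj).__name__]
        return json.dumps({"type": type(obj).__name__, "data": to_p(obj)})

    def read(self, record):
        """Rebuild a transient object from a persistent record."""
        rec = json.loads(record)
        _, from_p = self._converters[rec["type"]]
        return from_p(rec["data"])

class Track:
    """Toy transient event-data object."""
    def __init__(self, pt, eta):
        self.pt, self.eta = pt, eta

svc = ConversionService()
svc.register(Track,
             to_persistent=lambda t: [t.pt, t.eta],
             from_persistent=lambda d: Track(*d))

record = svc.write(Track(42.5, 1.1))
t = svc.read(record)
print(t.pt, t.eta)
```

The design point this illustrates is separation of concerns: the transient EDM classes know nothing about the persistence technology, so the back end (here JSON, in practice POOL/ROOT) can change without touching client code.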
WBS 2.2.4 Application Software (F. Luehring)
Geant3 simulation support: BNL
Calorimeter (LAr & Tile) software, incl. calibration: ANL, BNL, Nevis Labs, U. Arizona, U. Chicago, U. Pittsburgh, SMU
Pixel and TRT detector simulation & digitization: Indiana U., LBNL
Muon reconstruction and database: BNL, Boston U., LBNL, U. Michigan
Hadronic calibration, tau and jet reconstruction: U. Arizona, U. Chicago, ANL, BNL, LBNL
Electron-gamma reconstruction: BNL, Nevis Labs, SMU
High Level Trigger software: U. Wisconsin
Physics analysis with new software: U.S. ATLAS
WBS 2.2.5 Software Support (A. Undrus)
Release and maintenance of ATLAS and all associated external software (including LCG software and LHCb Gaudi builds) at the Tier 1 Facility.
Deployment of a nightly build system at BNL and CERN, now used by LCG as well.
Testing releases with new compilers (gcc 3.2, Sun 5.2).
Software Infrastructure Team: a forum for discussion of issues related to support of ATLAS software and associated tools. A. Undrus is a member of this body.
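The skeleton of a nightly-build driver of the kind described above can be sketched as: run each stage in order, record pass/fail, stop at the first failure, and emit a summary. This is purely illustrative; the stage names and commands are stand-ins, and the actual ATLAS nightly system is far more elaborate.

```python
# Minimal sketch of a nightly-build driver: ordered stages, fail-fast,
# summary report. Commands here are placeholders for real checkout/build/test.
import datetime
import subprocess

STAGES = [
    ("checkout", ["python3", "-c", "print('checked out release')"]),
    ("build",    ["python3", "-c", "print('built packages')"]),
    ("test",     ["python3", "-c", "import sys; sys.exit(0)"]),
]

def run_nightly(stages):
    """Run stages in order; stop at the first failure."""
    results = {}
    for name, cmd in stages:
        proc = subprocess.run(cmd, capture_output=True, text=True)
        results[name] = (proc.returncode == 0)
        if proc.returncode != 0:
            break  # later stages depend on earlier ones
    return results

def summary(results):
    """Render a one-page status summary for the nightly."""
    lines = ["nightly " + datetime.date.today().isoformat()]
    for name, ok in results.items():
        lines.append("  %s: %s" % (name, "OK" if ok else "FAILED"))
    return "\n".join(lines)

if __name__ == "__main__":
    print(summary(run_nightly(STAGES)))
```

The fail-fast choice mirrors why nightlies are useful: a broken checkout or build invalidates everything downstream, so the report points directly at the first broken stage.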
US FY03 contribution in international context

Category           US (FTE)            Non-US (FTE)   Total (FTE)   LCG (FTE)
Framework          3.25                0.75           4.0           1.3
EDM                0.5 (P) + 0.5 (B)   0              1.0           0.0
Det. Description   0.0                 1.0            1.0           0.0
Data Management    4.6                 4.0            8.6           1.2
Graphics           0.0                 0.25           0.25          0.0
SW Infrastructure  0.7                 2.95           3.45          0.1
Total              9.05 (P) + 0.5 (B)  8.95           18.5          2.6

* Excludes the coordination-role contributions of David Quarrie and Torre Wenaus.
LCG Application Component
US effort in SEAL: 1.0 FTE (FY03)
  Plug-in manager (M. Marino, 0.75 FTE, LBNL): internal use by POOL now; full integration into Athena in Q3 2003
  Scripting services (W. Lavrijsen, 0.25 FTE, LBNL): Python support and integration
US effort in POOL: 1.2 FTE (FY03)
  Principal responsibility in the POOL collections and metadata WP: D. Malon, K. Karr, S. Vanyachine (0.5 FTE, ANL)
  POOL datasets: D. Adams (0.2 FTE, BNL)
  Common data management software: V. Perevoztchikov, ROOT I/O foreign object persistence (0.3 FTE, BNL)
  POOL MySQL package and server configurations (ANL, 0.2 FTE)
US ATLAS contribution in LCG

Category                                      People   FTE     US People   US FTE
Total LCG hires                               22       21.35
Working directly for apps area projects       17       16.55
ROOT                                          2        2
Grid integration work with experiments        3        2.8
Apps area project contributions from IT       4        3.3
EP/SFT, not experiment specific               20       17.1
EP/SFT, experiment specific                   7        4.45    1           0.75
Experiments outside EP/SFT                    28       11.9    10          2.7
Total directly working on apps area projects  76       53.3    11          3.45
Overall applications area total               81       58.1    11          3.45

• Contribution to the Applications Area only
• Snapshot (June 2003) contribution
ATLAS interactions with LCG
Lack of manpower has made ATLAS participation weaker than we would like. Little or no effort is available to:
  Participate in design discussions of POOL & SEAL components for which we are not directly responsible
  Evaluate and test new features
  Write ATLAS acceptance tests for POOL releases and for specifically requested features
  Ensure that ATLAS priorities are kept prominent in LCG plans (ATLAS does this, but our voice has at times seemed not as loud as that of our sister experiments)
Less development has been contributed to the collections/metadata work package (for which we are responsible) than we would have liked, though this should improve soon with the recent hire at ANL.
FY04 Plans
International ATLAS Planning
ATLAS has a planning officer: T. LeCompte (ANL)
The current focus is on defining the WBS and establishing coherent short-term plans. The US WBS was used as a starting point!
Responsible for monitoring all deliverables, including non-ATLAS components (such as LCG), and assessing the impact of any delays.
Responsible for establishing software agreements and scope with international ATLAS institutions.
ATLAS Computing Timeline (2003-2007)
• Jul 03  POOL/SEAL release
• Jul 03  ATLAS release 7 (with POOL persistency)
• Aug 03  LCG-1 deployment
• Dec 03  ATLAS completes Geant4 validation
• Mar 04  ATLAS release 8
• Apr 04  DC2 Phase 1: simulation production
• Jun 04  DC2 Phase 2: reconstruction (the real challenge!)
• Jun 04  Combined test beams (barrel wedge)
• Dec 04  Computing Model paper
• Jul 05  ATLAS Computing TDR and LCG TDR
• Oct 05  DC3: produce data for PRR and test LCG-n
• Nov 05  Computing Memorandum of Understanding
• Jul 06  Physics Readiness Report
• Oct 06  Start commissioning run
• Jul 07  GO!
Major near-term milestones
July to Dec 2003: SEAL/POOL/PI deployment by LCG
Sept. 2003: Geant4-based simulation release
Dec. 2003: Validate the Geant4 release for DC2 and test-beam
Dec. 2003: First release of the full ATLAS software chain using LCG components and Geant4, for use in DC2 and the combined test-beam
Spring 2004: Combined test-beam runs
Spring 2004: Data Challenge 2, the principal means by which ATLAS will test and validate its proposed Computing Model
Dec. 2004: Computing Model document released
U.S. scope issues
2003-2004: Develop sufficient core software infrastructure to deploy and exercise a reasonable prototype of the ATLAS Computing Model. ATLAS is quite far from being able to do this.
Now is not the time to sacrifice core software development; doing so puts the TDR, and hence readiness for LHC turn-on, at risk.
The U.S. was asked to lead the effort in coordinating, developing and deploying the ATLAS architecture (from ground zero in 1999): leadership roles in the Software Project, Architecture and Data Management, with major responsibilities but minimal resources to work with.
We are responsible for ensuring the success of the ATLAS architecture. Efforts continue to encourage and recruit non-US institutions and US universities to contribute to the core, and to leverage the LCG.
Core Software & Physicists
The presence of a strong core team in the U.S. has helped U.S. physicists make significant contributions to reconstruction, simulation and physics analysis, in turn allowing them to play an influential role in the overall ATLAS software program. Examples include LAr and InDet simulation; Calo and Muon reconstruction; event generation infrastructure; egamma, tau and jet reconstruction; and calibration.
Conversely, this has also allowed U.S. physicists to provide valuable feedback to core software and, in some cases, to contribute to core development. Examples are the Event Data Model and Detector Description efforts.
This harmony is necessary for the U.S. to develop the necessary expertise and contribute effectively to the physics at turn-on.
Incremental Effort: Core Services
Redirections:
  C. Leggett (0.5 FTE from calibration infrastructure to EDM)
  M. Marino (0.25 FTE from training to SEAL/framework)
Additions (in decreasing priority):
  +1.0 FTE in Detector Description, WBS 2.2.2.3 (U. Pittsburgh): new hire to work with J. Boudreau
  +0.5 FTE in Analysis Tools support, WBS 2.2.2.5: new hire or redirection of effort
  +1.0 FTE in Graphics, WBS 2.2.2.4 (UC Santa Cruz): an existing person (G. Taylor) currently making significant contributions to ATLANTIS (the ATLAS graphics package)
Detector Description
ATLAS lacked a detector description model: numbers were hardwired in reconstruction, with no commonality with simulation.
Joe Boudreau (U. Pittsburgh), bringing CDF experience, successfully designed, developed and deployed a prototype model for both material and readout geometry. We encouraged this! The model automatically handles alignments, is optimized for memory (5 MB to describe the ATLAS geometry), and is not coupled to visualization software.
He is currently resident at Oxford, helping sub-systems migrate. No surprise, Joe's workload has increased; critical items include the Material Integration Service, a configuration utility, identifiers, and a transient model for the readout geometry.
It is important to support such university-based contributions to core software.
Incremental Effort: Data Management
Our plan has always been to sustain a 6.5 FTE effort.
Recent cuts in 2002:
  Ed Frank, U. Chicago
  BNL hire: job offered but retracted due to last-minute budget cuts
These cuts have impacted our ability to deliver what was promised: we are unable to save and restore objects from the persistent event store, and there are no ATLAS interfaces to event collections, catalogs and metadata.
Approximate allocation of new effort (in decreasing priority):
  +1.0 FTE Collections, Catalogs, and Metadata (WBS 2.2.3.5)
  +1.0 FTE Common Data Management Software (WBS 2.2.3.2)
  +0.5 FTE Event Store (WBS 2.2.3.3)
If no funds are available, redirect 0.5 FTE each from WBS 2.2.3.1 and 2.2.3.5.
Impact of Insufficient Funds
In descoping order:
  -1.0 FTE in Graphics: impacts our ability to have any reasonable visualization software for the test-beam or Data Challenge 2
  -0.5 FTE in Analysis Tools: impacts our ability to deliver a framework for analysis
  -1.0 FTE in Data Management: 0.5 FTE supporting non-event data management, 0.5 FTE supporting basic database services
  -1.0 FTE in Detector Description: jeopardizes our ability to deliver key components, including the Material Integration Service and a common geometry for simulation and reconstruction
  -1.0 FTE in Common Data Management Software: impacts contributions to POOL and integration aspects, schema management
  -0.5 FTE in Event Store: support for a persistent EDM and event selection
(Successive descoping steps are labeled Model 4, Model 5 and Model 6 on the slide.)
FY04 Ramp-Up Cost
[Chart: cost (FY04 k$), ranging from about 2000 to 3200, versus the prioritized incremental ramp-up in FTE (+0, +0.5, +1.5, +2.5, +3.5, +4, +5), with the FY04 guidance from J. Shank indicated.]
WBS-Personnel Summary

WBS      Description                        FY03   LCG(03)  FY04   LCG(04)
2.2      Software                           12     2.6      16.5   4.3
2.2.1    Coordination                       2      0.3      2      0.3
2.2.1.1  Software Project                   1      0.1      1      0.1
2.2.1.2  Data Management                    1      0.2      1      0.2
2.2.2    Core Services                      3.75   1        6.5    2
2.2.2.1  Framework                          3.25   1        2.6    1.5
2.2.2.2  EDM Services                       0.5             1.2
2.2.2.3  Det. Description                   0               1      0.5
2.2.2.4  Graphics                           0               1
2.2.2.5  Analysis Tools                     0               0.5
2.2.2.6  Grid Integration                   0               0
2.2.3    Data Management                    3.6    1.2      6.5    2
2.2.3.1  DB Services & Servers              0.5    0.2      0.5    0
2.2.3.2  Common Data Mgmt Software          1.1    0.3      2      0.5
2.2.3.3  Event Store                        0.3    0.2      1      0.5
2.2.3.4  Non-Event Data Management          0.9             1
2.2.3.5  Collections, Catalogs, Metadata    0.8    0.5      2      1
2.2.4    Application Software               1.4             0.3
2.2.5    Software Support                   1.25   0.1      1.2    0.1
Conclusions
Request for +5 FTE in FY04:
  2.5 FTE to bring Data Management to its intended level of effort
  1 FTE, university based, for Detector Description
  0.5 FTE for a contribution to Analysis Tools
  1 FTE, university based, to support Graphics
The guidance given for FY04 can accommodate only 1.5 FTE.
The U.S. ATLAS LCG contribution will be 4.0 FTE in FY04: 2.0 FTE each in the Core Services and Data Management work packages.