LLNL Data Center Consolidation Initiative
Anna Maria Bailey, PE
Lawrence Livermore National Laboratory
October 29, 2013
LLNL-PRES-457823

This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Mission (developed in 2009)
Continue to develop projects that optimize LLNL's data centers and reduce energy intensity
Achievements
• DOE FEMP 2009 Energy Award
• B453 LEED Gold Certified 2009
• B451 LEED Silver Certified 2011
• B112 SVLG Recognition 2012
— Collaboration
– Lawrence Berkeley National Laboratory
– Romonet
– Syska Hennessy
• B112 Uptime Green IT Nominee 2013
• Patented HPC Modular Solutions 2013
Master Plan Core Competencies
Benchmarking
Computational fluid dynamics (CFD)
LEED certifications
Consolidations
Liquid cooling advances
Innovative electrical and mechanical distribution
Metering and sub-metering
Lighting control systems
Integrated control systems
Lower power usage effectiveness (PUE)
Power management systems
Increase efficiencies and decrease redundant labor support
Reduce footprint, water usage, and energy intensity site-wide
Reduce facility and network infrastructure
Repurpose redundant equipment
Clear out underutilized spaces
Right-size server maintenance contracts
Standardize IT support model to “One Lab”
Continue to virtualize services
Address FY 2012 DOE Sustainability Requirements
• Data Center: > 500 SF and > 1 server
• Fully meter all enduring data centers by FY15
• Obtain an LLNL site-wide weighted average PUE of 1.4 by FY15 (a sample calculation follows this list)
— Consolidation provides $40M in cost avoidance toward achieving the metering requirements
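A note on the PUE target: a site-wide weighted average PUE weights each room's PUE by its IT load (equivalently, total facility energy divided by total IT energy), rather than averaging the room ratios directly. A minimal sketch of that calculation in Python; the room names and loads below are hypothetical, not LLNL measurements:

```python
# Site-wide weighted average PUE = total facility energy / total IT energy,
# which weights each room's PUE by its IT load.
# Rooms and loads are hypothetical, for illustration only.
rooms = [
    # (room, IT load in kW, measured PUE)
    ("HPC room A", 4500.0, 1.30),
    ("Enterprise room B", 1200.0, 1.35),
    ("Legacy room C", 150.0, 2.10),
]

total_it = sum(it for _, it, _ in rooms)
total_facility = sum(it * pue for _, it, pue in rooms)
print(f"Site-wide weighted average PUE: {total_facility / total_it:.2f}")
# -> 1.33, under the 1.4 target even with one inefficient legacy room
```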
LLNL is one square mile
61 rooms site-wide across 29 facilities
• High Performance Computing (HPC) Simulation: 13 rooms
• HPC Data Analytics: 4 rooms
• Enterprise Computing: 44 rooms
[Site map: locations of HPC and Enterprise data centers across the site]
Initial Condition - 61 rooms identified and surveyed
23 Enterprise rooms have been shut down since the survey began in 2011
• Consolidated into enduring facilities
• Virtualized various applications
• 26,210 SF total space shut down (7,790 SF cold and dark; balance repurposed)
• $305,308 annual energy savings
• $43,547 annual maintenance savings on cold and dark rooms
• Over 500 servers consolidated into B-112 Enterprise Data Center
Remaining Rooms
• 13 HPC Simulation
• 4 HPC Data Analytics
• 21 Enterprise
— Nearly 4000 servers reside outside of B-112
— Classified and Unclassified
FY12 – FY15 Projects
13 Enterprise rooms to close
Repurpose 6 Enterprise rooms for secure network support and infrastructure
Repurpose B-115 from HPC to Enterprise Computing
*Classified Center

FY12 – FY13

| Data Center (Bldg/Room) | Size (Sq. Ft.) | Annual Energy Savings | Annual Facility Maintenance Savings | Facility & Operations @ $165k per factor unit (< 800 SF = .5; 800–1500 SF = 1; > 1500 SF = 1.5) | Devices to Consolidate to B-112 |
| 23 RMs | 26,210 | 305k | 43k | 3 (FTE) were realigned | 500 |
| Total | 26,210 | 305k | 43k | | 500 |

FY14

| Data Center (Bldg/Room) | Size (Sq. Ft.) | Annual Energy Savings | Annual Facility Maintenance Savings | Facility & Operations factor | Devices to Consolidate to B-112 |
| 111/170 | 842 | 10k | 1k | 1 | 24 |
| 191/2210 | 600 | 15k | 1k | .5 | 14 |
| 543/1073 | 1,742 | 15k | 3k | 1.5 | 40 |
| 132n/1470 | 743 | 10k | 1k | .5 | 91 |
| Total | 3,927 | 50k | 6k | 577k | 169 |
*Classified Center

FY14 – FY15

| Data Center (Bldg/Room) | Size (Sq. Ft.) | Annual Energy Savings | Annual Facility Maintenance Savings | Facility & Operations factor (< 800 SF = .5; 800–1500 SF = 1; > 1500 SF = 1.5) | Devices to Consolidate to B-112 |
| 121/1146 | 672 | 13k | 2k | .5 | 137 |
| 121/1347 | 2,000 | 29k | 3k | 1.5 | |
| 131/2280 | 1,956 | 41k | 3k | 1.5 | 114 |
| 170/1008* | 973 | 20k | 2k | 1 | 78 |
| 111/178* | 537 | 11k | 1k | .5 | 5 |
| 132/2506* | 2,204 | 48k | 5k | 1.5 | 15 |
| Total | 8,342 | 162k | 16k | 1M | 349 |

FY16

| Data Center (Bldg/Room) | Size (Sq. Ft.) | Annual Energy Savings | Annual Facility Maintenance Savings | Facility & Operations factor | Devices to Consolidate to B-112 |
| 5 RMs | 6,240 | 56k | 15k | 5 | 2,400 |
| Total | 6,240 | 56k | 15k | 825k | 2,400 |
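The Facility and Operations column in the tables above follows a simple rubric: each room receives a staffing factor based on size (< 800 SF = .5; 800–1500 SF = 1; > 1500 SF = 1.5), and the column total is the factor sum costed at $165k per unit (e.g., the FY14 factors 1 + .5 + 1.5 + .5 = 3.5 give the 577k shown). A minimal sketch of that arithmetic; the function name and framing are ours, with room data taken from the FY14 column:

```python
# Facility & Operations support: staffing factor keyed to room size,
# costed at $165k per factor unit (per the rubric in the table header).
COST_PER_FACTOR_UNIT = 165_000

def ops_factor(size_sf: float) -> float:
    """Return the staffing factor for a room of the given size (sq. ft.)."""
    if size_sf < 800:
        return 0.5
    if size_sf <= 1500:
        return 1.0
    return 1.5

# FY14 rooms from the table: (building/room, size in sq. ft.)
fy14_rooms = [("111/170", 842), ("191/2210", 600),
              ("543/1073", 1742), ("132n/1470", 743)]

factor_sum = sum(ops_factor(sf) for _, sf in fy14_rooms)
print(f"factor sum = {factor_sum}, "
      f"F&O cost = ${factor_sum * COST_PER_FACTOR_UNIT:,.0f}")
# -> factor sum = 3.5, F&O cost = $577,500 (the 577k FY14 total)
```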
Standardize: Reduce labor costs resulting from fewer facilities and configurations
Virtualize: Reduce costs for hardware, space and utilities
Software: Reduce licensing costs by eliminating unused licenses
Hardware: Better accountability of assets
Security: Timely system patches
Best Practices from IT Community
• Reduce costs for paid vendor technical support
• Reduce costs for training, especially with travel
Integrated Systems Management
— Reduce personnel costs
— Lower utility costs
Vendor Financing (Managed Lifecycle and Leasing)
— Conserves expenditures by spreading purchase cost over time
Survey – Measure – Improve
Savings to Date - $348K
Projected FY14/FY15 Savings - $1.8M
Projected FY16 – FY18 Savings - $896k
An additional $600k in savings is projected from consolidating maintenance (hardware/network) contracts and virtualizing 30% of the physical servers
Upon completion, approximately 45,000 SF will be shut down or repurposed
Total Savings - $3.6M (rollup shown below)
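As a quick arithmetic check, the line items above sum to the stated total (values taken straight from this slide):

```python
# Rollup of the savings figures on this slide.
savings = {
    "to date": 348_000,
    "FY14/FY15 projected": 1_800_000,
    "FY16-FY18 projected": 896_000,
    "contract consolidation + 30% virtualization": 600_000,
}
total = sum(savings.values())
print(f"Total savings: ${total / 1e6:.1f}M")  # -> $3.6M
```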
• B-115
— HPC has outgrown the facility and will consolidate into other Livermore Computing facilities
— Identified as enduring for Enterprise Computing (unclassified backup and primary classified) until other facilities can be constructed
— FY13 - West and east rooms
• Repurposing equipment from closed data center rooms
— FY14 - Remainder of consolidation
• Study underway
Future HPC generation systems require major facility innovation
• Liquid cooling advances
• Efficient electrical distribution
• Robust structural solutions
25 Year Master Plan
• HPC Facility Gap Analysis
— Systems spread across facilities that
range from 9 to 60 years old
• Site-wide consolidations
• Shared resources with other Laboratories
• Sustainable HPC modular solutions
Developed a process to assess the probability of siting future extreme-scale systems, by system class, to meet the mission. Facilities were evaluated for the modifications required. Buildings were ranked by system class versus required infrastructure modifications: Structural (S), Electrical (E), and Mechanical (M).
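One way such a ranking could be tabulated is sketched below; the buildings, the system class label, and the 0–3 modification scores are hypothetical placeholders, not LLNL's actual evaluation:

```python
# Hypothetical siting assessment: score the modifications each building
# would need (0 = none, 3 = major) in each category, then rank buildings
# for a given system class by total modification burden. Illustrative only.
MODS = ("S", "E", "M")  # Structural, Electrical, Mechanical

# building -> {system class: (S, E, M) modification scores}
evaluations = {
    "Bldg A": {"extreme-scale": (1, 2, 2)},
    "Bldg B": {"extreme-scale": (3, 3, 2)},
    "Bldg C": {"extreme-scale": (2, 3, 3)},
}

def rank(system_class):
    scored = [(b, sum(ev[system_class])) for b, ev in evaluations.items()
              if system_class in ev]
    return sorted(scored, key=lambda pair: pair[1])  # fewest mods first

for bldg, burden in rank("extreme-scale"):
    print(f"{bldg}: total modification score = {burden}")
```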
Utilize existing facilities based on their enduring capabilities
Demolish or repurpose aging facilities that cannot meet mission
Build new facilities to meet mission
— Focus on sustainability and energy efficiency through flexibility
— Scale footprint with the computational technology
— Deploy innovative HPC facility design methodologies
• Clear, unencumbered space
• Scalable mechanical and electrical infrastructure
• Advanced liquid cooling options
• Deploy free cooling
• Cool roof
• Accommodate increased weights
• Decrease space impediments
[Site map of planned facilities: Future HPC Unclassified Facility (2015); Future HPC SRD Facility (2018); Future HPC High-Side Data Analytics and Enterprise Facility (2020); existing Building 112 shown for reference.]
Master Plan Core Competencies
Benchmarking
Computational fluid dynamics (CFD)
LEED certifications
Consolidations
Liquid cooling advances
Innovative electrical and mechanical distribution
Metering and sub-metering
Lighting control systems
Integrated control systems
Lower power usage effectiveness (PUE)
Power management systems
Laboratory “Mission” comes first
Understand the obstacles and challenges
Develop a strategic vision
Obtain senior management support
Provide continual leadership
Remain patient
Stay focused
Continue to survey, measure and improve
Anna Maria Bailey
Lawrence Livermore National Laboratory
7000 East Ave, PO Box 808, L-554
Livermore, CA 94550
Phone: (925) 423-1288 / Email: [email protected]

Andy Ashbaugh
Lawrence Livermore National Laboratory
7000 East Ave, PO Box 808, L-720
Livermore, CA 94550
Phone: (925) 422-1137 / Email: [email protected]

Marriann Silveira
Lawrence Livermore National Laboratory
7000 East Ave, PO Box 808, L-554
Livermore, CA 94550
Phone: (925) 423-5049 / Email: [email protected]