Top Banner
NREL is a national laboratory of the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, operated by the Alliance for Sustainable Energy, LLC. US Trends in Data Centre Design with NREL Examples of Large Energy Savings. Understanding and Minimising The Costs of Data Centre Based IT Services Conference, University of Liverpool. Otto Van Geet, PE. June 17, 2013.

US Trends in Data Centre Design with NREL Examples of Large Energy Savings

Nov 12, 2014


Summary
Background
Information Technology Systems
Environmental Conditions
Air Management
Cooling Systems
Electrical Systems
Other Opportunities for Energy Efficient Design
Data Center Metrics & Benchmarking
Transcript
Page 1: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

NREL is a national laboratory of the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, operated by the Alliance for Sustainable Energy, LLC.

US Trends in Data Centre Design with NREL Examples of Large Energy Savings

Understanding and Minimising The Costs of Data Centre Based IT Services Conference

University of Liverpool
Otto Van Geet, PE
June 17, 2013

Page 2: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

2

[Chart: Total Annual Electrical Cost: Compute + Facility – cost in millions of dollars plotted against P.U.E., split into Facility and HPC (compute) components. Assumes a ~20 MW HPC system and $1M per MW-year utility cost.]

Cost and Infrastructure Constraints
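The arithmetic behind the chart is simple; the sketch below (Python, using the slide's stated assumptions of a ~20 MW HPC load and $1M per MW-year utility cost) shows how the facility share of the annual bill grows with PUE. The specific PUE values are illustrative, not from the slide.

```python
# Annual electricity cost split between compute and facility overhead,
# assuming a ~20 MW HPC (IT) load and $1M per MW-year utility cost (from the slide).
IT_LOAD_MW = 20.0
COST_PER_MW_YEAR = 1.0  # million dollars per MW-year

for pue in (1.06, 1.2, 1.5, 2.0):
    compute_cost = IT_LOAD_MW * COST_PER_MW_YEAR                  # $M/yr for the IT load itself
    facility_cost = IT_LOAD_MW * (pue - 1.0) * COST_PER_MW_YEAR   # $M/yr for cooling, UPS, fans, ...
    total = compute_cost + facility_cost
    print(f"PUE {pue:4.2f}: compute ${compute_cost:4.1f}M + facility ${facility_cost:4.1f}M = ${total:5.1f}M per year")
```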

Page 3: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

3

BPG Table of Contents

• Summary
• Background
• Information Technology Systems
• Environmental Conditions
• Air Management
• Cooling Systems
• Electrical Systems
• Other Opportunities for Energy Efficient Design
• Data Center Metrics & Benchmarking

Page 4: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

4

CPUs ~65°C (149°F)
GPUs ~75°C (167°F)
Memory ~85°C (185°F)

CPU, GPU & memory represent ~75–90% of the heat load.

Safe Temperature Limits

Page 5: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

5

Data center equipment's environmental conditions should fall within the ranges established by ASHRAE as published in the Thermal Guidelines book.

Environmental Conditions

ASHRAE Reference: ASHRAE (2008), (2011)

Environmental Specifications (°C), measured at the equipment intake:

Temperature – Data Centers, ASHRAE:
  Recommended: 18°–27°C
  Allowable: 15°–32°C (A1) to 5°–45°C (A4)

Humidity (RH) – Data Centers, ASHRAE:
  Recommended: 5.5°C DP to 60% RH and 15°C DP
  Allowable: 20%–80% RH
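As a rough illustration of applying these limits, the sketch below (Python) checks a measured rack-inlet condition against the recommended envelope quoted above (18–27°C dry bulb, 5.5°C dew point to 60% RH and 15°C dew point). The Magnus dew-point approximation and the example inlet values are assumptions, not part of the slide.

```python
import math

def dew_point_c(temp_c, rh_pct):
    """Approximate dew point (deg C) via the Magnus formula."""
    a, b = 17.62, 243.12
    gamma = (a * temp_c) / (b + temp_c) + math.log(rh_pct / 100.0)
    return (b * gamma) / (a - gamma)

def in_recommended_envelope(temp_c, rh_pct):
    """Simplified check against the ASHRAE recommended range quoted on the slide."""
    dp = dew_point_c(temp_c, rh_pct)
    temp_ok = 18.0 <= temp_c <= 27.0
    moisture_ok = (dp >= 5.5) and (rh_pct <= 60.0) and (dp <= 15.0)
    return temp_ok and moisture_ok

# Hypothetical rack-inlet measurements
print(in_recommended_envelope(24.0, 45.0))   # True: inside the recommended range
print(in_recommended_envelope(30.0, 45.0))   # False: too warm (an allowable class may still permit it)
```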

Page 6: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

6

2011  ASHRAE  Allowable  Ranges  

Dry Bulb Temperature

Page 7: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

7

Psychrometric Bin Analysis

[Psychrometric chart: Boulder, Colorado TMY3 weather data plotted as humidity ratio (lb water/lb dry air) vs. dry-bulb temperature (°F), with the ASHRAE Class 1 recommended and allowable ranges overlaid along relative-humidity and wet-bulb lines.]

Design Conditions (0.4%): 91.2 db, 60.6 wb

Page 8: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

8

Estimated Savings

Baseline System: DX cooling with no economizer
Load: 1 ton of cooling, constant year-round
Efficiency (COP): 3
Total Energy: 10,270 kWh/yr

Results (hours / energy, kWh)                RECOMMENDED RANGE        ALLOWABLE RANGE
Zone 1: DX Cooling Only                      25 h / 8 kWh             2 h / 1 kWh
Zone 2: Multistage Indirect Evap. + DX (H80) 26 h / 16 kWh            4 h / 3 kWh
Zone 3: Multistage Indirect Evap. Only       3 h / 1 kWh              0 h / 0 kWh
Zone 4: Evap. Cooler Only                    867 h / 97 kWh           510 h / 57 kWh
Zone 5: Evap. Cooler + Outside Air           6,055 h / 417 kWh        1,656 h / 99 kWh
Zone 6: Outside Air Only                     994 h / 0 kWh            4,079 h / 0 kWh
Zone 7: 100% Outside Air                     790 h / 0 kWh            2,509 h / 0 kWh
Total                                        8,760 h / 538 kWh        8,760 h / 160 kWh
Estimated % Savings                          95%                      98%
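A quick back-of-the-envelope check of the table's baseline and savings figures (a sketch; the 3.517 kW-per-ton conversion and the rounding are mine, not the slide's):

```python
# Baseline: 1 ton of DX cooling, constant year-round, at COP 3.
TON_KW = 3.517                       # 1 ton of refrigeration expressed in thermal kW
baseline_kwh = TON_KW / 3.0 * 8760   # electrical energy for the year
print(f"baseline: {baseline_kwh:,.0f} kWh/yr")     # ~10,270 kWh/yr, matching the table

# Cooling energy when economizer/evaporative zones handle most hours (totals from the table):
for label, kwh in (("recommended range", 538), ("allowable range", 160)):
    savings = 1 - kwh / baseline_kwh
    print(f"{label}: {kwh} kWh/yr -> ~{savings:.0%} savings")
```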

Page 9: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

9

Data  Center  Efficiency  Metric  

• Power Usage Effectiveness (P.U.E.) is an industry-standard data center efficiency metric.
• The ratio of total power drawn by the data center (compute plus facility infrastructure: pumps, lights, fans, conversions, UPS…) to the power used by compute.
• Not perfect; some folks play games with it.
• A 2011 survey estimates the industry average is 1.8.
• In a typical data center, half of the power goes to things other than compute capability.

P.U.E. = (“IT power” + “Facility power”) / “IT power”
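A minimal sketch of the metric as defined above (the example power figures are hypothetical):

```python
def pue(it_power_kw, facility_power_kw):
    """Power Usage Effectiveness: total data center power divided by IT power."""
    return (it_power_kw + facility_power_kw) / it_power_kw

# Hypothetical example: 1,000 kW of IT load with 800 kW of cooling/UPS/lighting overhead
print(pue(1000.0, 800.0))   # 1.8 -- roughly the 2011 industry-average figure cited above
```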

Page 10: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

10

PUE – Simple and Effective

Page 11: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

11

[Chart: Data Center PUE (axis 0.75–1.45) plotted alongside outdoor temperature (−20 to 100 °F) over time.]

Page 12: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

“I am re-using waste heat from my data center on another part of my site and my PUE is 0.8!”

ASHRAE & friends (DOE, EPA, TGG, 7x24, etc.) do not allow reused energy in PUE, and PUE is always >1.0.

Another metric has been developed by The Green Grid and partners: ERE – Energy Reuse Effectiveness.

http://www.thegreengrid.org/en/Global/Content/white-papers/ERE

Page 13: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

13

ERE – Adds Energy Reuse

[Diagram: facility energy flows – utility feed, cooling, UPS, PDU, and IT load, with rejected energy and reused energy streams and measurement points (a) through (g).]
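The Green Grid white paper linked on the previous page defines ERE as total facility energy minus reused energy, divided by IT energy; equivalently ERE = (1 − ERF) × PUE, where ERF is the fraction of total energy that is reused. A sketch with hypothetical meter readings (chosen to land near the ESIF figures quoted later in the deck):

```python
def ere(it_kwh, cooling_kwh, power_dist_kwh, other_kwh, reused_kwh):
    """Energy Reuse Effectiveness per The Green Grid: (total energy - reused energy) / IT energy."""
    total = it_kwh + cooling_kwh + power_dist_kwh + other_kwh
    return (total - reused_kwh) / it_kwh

# Hypothetical annual meter readings (kWh)
it, cooling, dist, other, reuse = 1_000_000, 40_000, 15_000, 5_000, 350_000
print(f"PUE = {(it + cooling + dist + other) / it:.2f}")          # 1.06
print(f"ERE = {ere(it, cooling, dist, other, reuse):.2f}")        # ~0.71: can drop below 1.0 when heat is reused
```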

Page 14: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

14

Credit: Haselden Construction

• More than 1,300 people in DOE office space on NREL's campus
• 33,445 m²
• Design/build process with required energy goals:
  - 50% energy savings from code
  - LEED Platinum
• Replicable: process, technologies, cost
• Site, source, carbon, cost ZEB:B – includes plug loads and data center
• Firm fixed price – US $22.8/m² construction cost (not including $2.5/m² for PV from PPA/ARRA)
• Opened June 10, 2010 (first phase)

DOE/NREL  Research  Support  Facility    

Page 15: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

15

RSF  Datacenter  

• Fully contained hot aisle
  - Custom aisle floor and door seals
  - Ensure equipment is designed for cold aisle containment and installed to pull cold air, not hot air
  - 1.18 annual PUE
  - ERE = 0.9
• Control hot aisle based on a return temperature of ~90°F.
• Waste heat used to heat the building.
• Outside air and evaporative cooling.
• Low fan energy design.
• 176 m².

Credit: Marjorie Schott/NREL

Page 16: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

16

Page 17: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

17

Data Center Load GROWTH (40+ kW in 2 years) since NO recharge!

Page 18: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

18

Move  to  Liquid  Cooling  

• Server fans are inefficient and noisy.
  - Liquid doors are an improvement, but we can do better!
• Power densities are rising, making component-level liquid cooling solutions more appropriate.
• Liquid benefits:
  - Thermal stability, reduced component failures.
  - Better waste heat re-use options.
  - Warm water cooling reduces/eliminates condensation.
  - Provide cooling with higher temperature coolant.
• Eliminate expensive & inefficient chillers.
• Save wasted fan energy and use it for computing.
• Unlock your cores and overclock to increase throughput!

Page 19: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

19

Liquid  Cooling  –  Overview  

Water and other liquids (dielectrics, glycols, and refrigerants) may be used for heat removal.
• Liquids typically use LESS transport energy (14.36 air-to-water horsepower ratio for the example below).
• Liquid-to-liquid heat exchangers have closer approach temperatures than liquid-to-air (coils), yielding increased outside-air hours.
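To see why liquids need less transport energy, compare the volume of coolant each must move for the same heat load. The sketch below uses Q = ρ·V̇·cp·ΔT with typical fluid properties and an assumed 12 K temperature rise; it is illustrative and does not reproduce the slide's specific 14.36 horsepower ratio, which depends on the particular fan and pump systems in that example.

```python
# Rough comparison of transport needs for removing 100 kW of heat with air vs. water.
Q = 100e3            # heat to remove, W
dT = 12.0            # assumed temperature rise of the coolant, K

rho_air, cp_air = 1.2, 1005.0        # kg/m^3, J/(kg*K)
rho_h2o, cp_h2o = 998.0, 4186.0      # kg/m^3, J/(kg*K)

vdot_air = Q / (rho_air * cp_air * dT)   # m^3/s of air required
vdot_h2o = Q / (rho_h2o * cp_h2o * dT)   # m^3/s of water required

print(f"air flow:   {vdot_air:8.3f} m^3/s")
print(f"water flow: {vdot_h2o:8.5f} m^3/s")
print(f"volume ratio (air/water): {vdot_air / vdot_h2o:,.0f}x")   # roughly 3,500x less volume to move
```

Because transport power scales with volumetric flow times pressure rise, moving thousands of times less volume is why pumps generally beat fans for the same heat load.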

Page 20: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

20

2011  ASHRAE  Liquid  Cooling  Guidelines  

NREL ESIF HPC (HP hardware) using 24 °C supply, 40 °C return – W4/W5

Page 21: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

21

NREL HPC Data Center

Showcase Facility
• 10 MW, 929 m²
• Leverage favorable climate
• Use direct water-to-rack cooling
• DC manager responsible for ALL DC costs, including energy!
• Waste heat captured and used to heat labs & offices.
• World's most energy efficient data center, PUE 1.06!
• Lower CapEx and OpEx.

Leveraged expertise in energy efficient buildings to focus on a showcase data center.

Chips-to-bricks approach
• Operational 1/2013, petascale+ HPC capability in 8/2013
• 20-year planning horizon, 5 to 6 HPC generations.

High Performance Computing

Page 22: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

22

Critical Data Center Specs

• Warm water cooling, 24 °C
  - Water is a much better working fluid than air – pumps trump fans.
  - Utilize high quality waste heat, 40 °C or warmer.
  - +90% of IT heat load to liquid.
• High power distribution
  - 480 VAC, eliminate conversions.
• Think outside the box
  - Don't be satisfied with an energy efficient data center nestled on a campus surrounded by inefficient laboratory and office buildings.
  - Innovate, integrate, optimize.

Dashboards report instantaneous, seasonal, and cumulative PUE values.

Page 23: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

23

• Data center equivalent of the “visible man”
  - Reveal not just boxes with blinking lights, but the inner workings of the building as well.
  - Tour views into pump room and mechanical spaces.
  - Color-coded pipes, LCD monitors.

NREL ESIF Data Center Cross Section

Page 24: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

24

•  2.5 MW – Day one capacity (Utility $500K/yr/MW)

•  10 MW – Ultimate Capacity

• Petaflop
• No Vapor Compression for Cooling

Data Center

Page 25: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

25

Summer Cooling Mode

PUE – Typical Data Center = 1.5–2.0; NREL ESIF = 1.04*
* 30% more energy efficient than your typical “green” data center

Data Center

Page 26: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

26

Winter Cooling Mode

ERE – Energy Reuse Effectiveness: how efficiently are we using the waste heat to heat the rest of the building? NREL ESIF = 0.7 (we use 30% of the waste heat; more with future campus loops)

Future Campus Heating Loop

Future Campus Heating Loop

High Bay Heating Loop

Office Heating Loop

Conference Heating Loop

Data Center

Page 27: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

27

95 deg Air

75 deg Air

•  Water to rack Cooling for High Performance Computers handles 90% of total load

•  Air Cooling for Legacy Equipment handles 10% of total Load

Data Center – Cooling Strategy

Page 28: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

28

PUE 1.0X – Focus on the “1”

Facility PUE

IT Power Consumption

Energy Re-use

We all know how to do this!

True efficiency requires 3-D optimization.

Page 29: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

29

Facility PUE

IT Power Consumption

Energy Re-use

We all know how to do this!

Increased work per watt Reduce or eliminate fans Component level heat exchange Newest processors are more efficient.

True efficiency requires 3-D optimization.

PUE 1.0X – Focus on the “1”

Page 30: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

30

Facility PUE

IT Power Consumption

Energy Re-use True efficiency requires 3-D optimization.

We all know how to do this!

Increased work per watt Reduce or eliminate fans Component level heat exchange Newest processors are more efficient.

Direct liquid cooling, Higher return water temps Holistic view of data center planning

PUE 1.0X – Focus on the “1”

Page 31: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

31

What’s  Next?  

✓ Energy efficient supporting infrastructure.
✓ Pumps, large pipes, high voltage (380 to 480 V) electrical to the rack.
✓ Efficient HPC for the planned workload.
✓ Capture and re-use waste heat.

Can we manage and “optimize” workflows, with a varied job mix, within a given energy “budget”? Can we do this as part of a larger “ecosystem”?

Steve Hammond

Page 32: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

32

Other  Factors  

32

DemandSMART: Comprehensive Demand Response. Balancing supply and demand on the electricity grid is difficult and expensive; end users that provide a balancing resource are compensated for the service.

[Chart: annual electricity demand as a percent of available capacity (25–100%) across winter, spring, summer, and fall.]

4MW solar

Use waste heat

Better rates, shed load

DC as part of Campus Energy System

Page 33: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

33

Parting Thoughts

• Energy Efficient Data Centers – been there, done that
  - We know how; let's just apply best practices.
  - Don't fear H2O: liquid cooling will be increasingly prevalent.
• Metrics will lead us into sustainability
  - If you don't measure/monitor it, you can't manage it.
  - As PUE has done, ERE, Carbon Usage Effectiveness (CUE), etc. will help drive sustainability.
• Energy Efficient and Sustainable Computing – it's all about the “1”
  - 1.0 or 0.06? Where do we focus? Compute & energy reuse.
• Holistic approaches to Energy Management.
  - Lots of open research questions.
  - Projects may get an energy allocation rather than a node-hour allocation.

Page 34: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

34

Otto Van Geet, 303.384.7369

[email protected]

NREL RSF: 50% of code energy use, net zero annual energy, $22.8/m² construction cost

QUESTIONS?  

Page 35: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

35

• Thermoelectric power generation (coal, oil, natural gas, and nuclear) consumes about 1.1 gallons per kWh, on average.
• This amounts to about 9.6 M gallons per MW-year.
• We estimate about 2.5 M gallons of water consumed per MW-year for on-site evaporative cooling towers at NREL.
• If chillers need 0.2 MW per MW of HPC power, then chillers have an impact of 2.375 M gallons per year per MW.
• Actuals will depend on your site, but evaporative cooling doesn't necessarily result in a net increase in water use (see the rough arithmetic sketch below).
• Low energy use = lower water use. Energy reuse uses NO water!

Water  ConsideraLons  

“We shouldn’t use evaporative cooling, water is scarce.”

NREL PIX 00181
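A rough reproduction of the water arithmetic in the bullets above (illustrative only; the slide's 2.375 M gallon chiller figure is not reproduced exactly, since its precise assumptions are not shown, and actual numbers are site-specific):

```python
# Rough water-use arithmetic behind the slide (illustrative; site numbers will vary).
HOURS_PER_YEAR = 8760
thermo_gal_per_kwh = 1.1          # avg water consumed by thermoelectric generation, gal/kWh

# Off-site water embedded in 1 MW-year of electricity:
gal_per_MW_year = thermo_gal_per_kwh * 1000 * HOURS_PER_YEAR
print(f"{gal_per_MW_year / 1e6:.1f} M gal per MW-year")        # ~9.6 M gal, matching the slide

# On-site evaporative cooling towers at NREL (slide estimate):
evap_gal_per_MW_year = 2.5e6

# Off-site water avoided if evaporative cooling eliminates ~0.2 MW of chiller power per MW of HPC:
avoided = 0.2 * gal_per_MW_year
print(f"evap towers use {evap_gal_per_MW_year / 1e6:.1f} M gal on site, "
      f"while avoiding roughly {avoided / 1e6:.1f} M gal of off-site water")
```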

Page 36: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

36

Data  Center  Efficiency  

•  Choices regarding power, packaging, cooling, and energy recovery in data centers drive TCO.

• Why should we care?
  - Carbon footprint.
  - Water usage.
  - Mega$ per MW-year.
  - Cost: OpEx ~ IT CapEx!
• A less efficient data center takes away power and dollars that could otherwise be used for compute capability.

Page 37: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

37

Holistic Thinking
• Approach to cooling: air vs. liquid, and where?
  - Components, liquid doors, or CRACs, …
• What is your “ambient” temperature?
  - 55°F, 65°F, 75°F, 85°F, 95°F, 105°F …
  - 13°C, 18°C, 24°C, 30°C, 35°C, 40.5°C …
• Electrical distribution:
  - 208 V or 480 V?
• “Waste” heat:
  - How hot? Liquid or air? Throw it away or use it?

Page 38: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

38

Liquid Cooling – New Considerations

• Air cooling
  - Humidity
  - Fan failures
  - Air-side economizers, particulates
• Liquid cooling
  - pH & bacteria
  - Dissolved solids
  - Corrosion inhibitors, etc.
• When considering liquid cooled systems, insist that providers adhere to the latest ASHRAE water quality spec, or it could be costly.

Page 39: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

39

2011  ASHRAE  Liquid  Cooling  Guidelines  

Page 40: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

40

2011  ASHRAE  Thermal  Guidelines  

2011 Thermal Guidelines for Data Processing Environments – Expanded Data Center Classes and Usage Guidance. White paper prepared by ASHRAE Technical Committee TC 9.9.

Page 41: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

41

Energy Savings Potential: Economizer Cooling

Energy savings potential for the recommended envelope, Stage 1: Economizer Cooling.

(Source:  Billy  Roberts,  NREL)    

Page 42: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

42

Data Center Energy

• Data centers are energy intensive facilities.
  - 10–100x more energy intensive than an office.
  - Server racks well in excess of 30 kW.
  - Power and cooling constraints in existing facilities.
• Data center inefficiency steals power that would otherwise support compute capability.
• It is important to have a DC manager responsible for ALL DC costs, including energy!

Page 43: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

43

Energy Savings Potential: Economizer + Direct Evaporative Cooling

Energy savings potential for the recommended envelope, Stage 2: Economizer + Direct Evap. Cooling.

(Source:  Billy  Roberts,  NREL)    

Page 44: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

44

Energy Savings Potential: Economizer + Direct Evap. + Multistage Indirect Evap. Cooling

Energy savings potential for the recommended envelope, Stage 3: Economizer + Direct Evap. + Multistage Indirect Evap. Cooling. (Source: Billy Roberts, NREL)

Page 45: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

45

Data  Center  Energy  Efficiency  

• ASHRAE 90.1-2011 requires economizers in most data centers.
• ASHRAE Standard 90.4P, Energy Standard for Data Centers and Telecommunications Buildings
• PURPOSE: To establish the minimum energy efficiency requirements of data centers and telecommunications buildings, for:
  - Design, construction, and a plan for operation and maintenance
• SCOPE: This standard applies to:
  - New data centers and telecommunications buildings, new additions, and modifications to such buildings or portions thereof and their systems
• Will set a minimum PUE based on climate.
• More detail at: https://www.ashrae.org/news/2013/ashrae-seeks-input-on-revisions-to-data-centers-in-90-1-energy-standard-scope

Page 46: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

46

1. Reduce the IT load – virtualization & consolidation (up to 80% reduction).
2. Implement a contained hot aisle and cold aisle layout.
   - Curtains, equipment configuration, blank panels, cable entrance/exit ports.
3. Install economizer (air or water) and evaporative cooling (direct or indirect).
4. Raise discharge air temperature. Install VFDs on all computer room air conditioning (CRAC) fans (if used) and network the controls.
5. Reuse data center waste heat if possible.
6. Raise the chilled water (if used) set-point.
   - Increasing chilled water temperature by 1°C reduces chiller energy use by about 3% (see the sketch after this list).
7. Install high efficiency equipment, including UPS, power supplies, etc.
8. Move chilled water as close to the server as possible (direct liquid cooling).
9. Consider a centralized high efficiency water-cooled chiller plant.
   - Air-cooled = 2.9 COP, water-cooled = 7.8 COP.

Energy Conservation Measures
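Rough arithmetic for measures #6 and #9 above (a rule-of-thumb sketch with a hypothetical constant load, not a design calculation):

```python
# Rough chiller-energy arithmetic for ECM #6 and #9.

def annual_chiller_kwh(cooling_load_kw, cop, hours=8760):
    """Electric energy to reject a constant cooling load for a year."""
    return cooling_load_kw / cop * hours

load_kw = 500.0                                        # hypothetical constant cooling load

air_cooled = annual_chiller_kwh(load_kw, cop=2.9)      # ECM #9: air-cooled plant
water_cooled = annual_chiller_kwh(load_kw, cop=7.8)    # ECM #9: water-cooled plant
print(f"air-cooled:   {air_cooled:,.0f} kWh/yr")
print(f"water-cooled: {water_cooled:,.0f} kWh/yr ({1 - water_cooled / air_cooled:.0%} less)")

# ECM #6: ~3% chiller energy reduction per 1 degC increase in chilled-water setpoint.
for delta_c in (1, 2, 5):
    saving = 1 - (1 - 0.03) ** delta_c                 # compounding the 3%/degC rule of thumb
    print(f"raise setpoint {delta_c} degC -> ~{saving:.0%} chiller energy saving")
```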

Page 47: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

47

Equipment Environmental Specification

Air Inlet to IT Equipment is the important specification to meet

Outlet temperature is not important to IT Equipment

Page 48: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

48

Recommended Range (Statement of Reliability): preferred facility operation; most values should be within this range.

Allowable Range (Statement of Functionality): robustness of equipment; no values should be outside this range.

[Diagram: rack intake temperature band showing MAX ALLOWABLE at the top, MAX RECOMMENDED (over-temp), the recommended range, MIN RECOMMENDED (under-temp), and MIN ALLOWABLE at the bottom; the allowable range spans the full band.]

Key Nomenclature

Page 49: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

49

Improve  Air  Management  

• Typically, more air is circulated than required.
• Air mixing and short circuiting lead to:
  - Low supply temperature
  - Low delta T (see the airflow sketch below)
• Use hot and cold aisles.
• Improve isolation of hot and cold aisles.
  - Reduce fan energy
  - Improve air-conditioning efficiency
  - Increase cooling capacity

Hot aisle/cold aisle configuration decreases mixing of intake & exhaust air, promoting efficiency.

Source: http://www1.eere.energy.gov/femp/pdfs/eedatacenterbestpractices.pdf
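The airflow sketch referenced above: the supply air needed to remove a given rack load scales inversely with the air temperature rise, which is why a low delta T forces excess circulation and fan energy. Illustrative numbers using standard air properties; the 10 kW rack is an assumption, not from the slide.

```python
# Airflow required to remove a rack heat load at various air temperature rises (delta-T).
RHO_CP_AIR = 1.2 * 1005.0        # volumetric heat capacity of air, J/(m^3*K)

def airflow_m3s(rack_kw, delta_t_k):
    """Supply airflow (m^3/s) to carry rack_kw of heat at a delta_t_k temperature rise."""
    return rack_kw * 1000.0 / (RHO_CP_AIR * delta_t_k)

for dt in (5, 10, 15, 20):       # air temperature rise across the IT equipment, K
    print(f"10 kW rack, delta-T {dt:2d} K -> {airflow_m3s(10, dt):.2f} m^3/s of supply air")
```

Since fan power in a fixed system rises roughly with the cube of flow, doubling the delta T can cut fan energy by far more than half.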

Page 50: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

50

Isolate  Cold  and  Hot  Aisles  

Source: http://www1.eere.energy.gov/femp/pdfs/eedatacenterbestpractices.pdf

70-80ºF vs. 45-55ºF

95-105ºF vs. 60-70ºF

Page 51: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

51

Adding Air Curtains for Hot/Cold Isolation

Photo used with permission from the National Snow and Ice Data Center. http://www.nrel.gov/docs/fy12osti/53939.pdf

Page 52: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

Courtesy of Henry Coles, Lawrence Berkeley National Laboratory

Page 53: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

53

Three (3) Cooling Device Categories

[Diagrams: IT equipment racks with rack-level, row-level, and passive-door cooling water connections, containment, and server-front orientation.]

1 – Rack Cooler
• APC – water
• Knürr (CoolTherm) – water
• Knürr (CoolLoop) – water
• Rittal – water

2 – Row Cooler
• APC (2*) – water
• Liebert – refrigerant

3 – Passive Door Cooler
• IBM – water
• Vette/Coolcentric – water
• Liebert – refrigerant
• SUN – refrigerant

Courtesy of Henry Coles, Lawrence Berkeley National Laboratory

Page 54: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

54

“Chill-off 2” Evaluation of Close-coupled Cooling Solutions

Courtesy of Geoffrey Bell and Henry Coles, Lawrence Berkeley National Laboratory

[Chart comparing close-coupled cooling solutions; annotation: less energy use]

Page 55: US Trends in Data Centre Design with NREL Examples of Large Energy Savings

55

Cooling Takeaways…

• Use a central plant (e.g., chiller/CRAHs) vs. CRAC units.
• Use centralized controls on CRAC/CRAH units to prevent simultaneous humidifying and dehumidifying.
• Move to liquid cooling (room, row, rack, chip).
• Consider VSDs on fans, pumps, chillers, and towers.
• Use air- or water-side free cooling.
• Expand the humidity range and improve humidity control (or disconnect it).