To 30kW and Beyond: High-Density Infrastructure Strategies that Improve Efficiencies and Cut Costs
Transcript
Page 1: High Density Cooling

To 30kW and Beyond:High-Density Infrastructure Strategies that

Improve Efficiencies and Cut Costs

Presenter
Presentation Notes
Thom: Hello, everyone, and welcome to our Emerson Network Power 2010 Business Innovators Series webcast: “To 30 Kilowatts and Beyond: High Density Infrastructure Strategies that Improve Efficiencies and Cut Costs.” I’m Thom Gall, and I’ll be your host. As you know, our webcast program is certified by the International Association for Continuing Education and Training, so attending a one-hour webcast qualifies you to earn one-tenth of a CEU training credit. Just check the box in your survey following the webcast, and you’ll receive your certificate within 2-3 weeks. Today, we’ll examine how high density infrastructure strategies can actually equal lower costs and help improve efficiency and availability. This presentation deck will be available to download after the webcast, or you can click the “download slides” button on the bottom right-hand side of your console. I’ll give a brief introduction of Emerson Network Power and its Liebert products and services; then we will move on to our presenter.
Page 2: High Density Cooling

© 2010 Emerson Network Power

Emerson Network Power: The global leader in enabling Business-Critical Continuity

Emerson Technologies

Uninterruptible Power • Power Distribution • Surge Protection • Transfer Switching • DC Power • Precision Cooling • High Density Cooling • Racks • Rack Monitoring • Sensors and Controls • KVM • Real-Time Monitoring • Data Center Software

Presenter
Presentation Notes
Emerson Network Power is an Emerson business and the global leader in enabling Business-Critical Continuity. That means they provide the technology that powers and protects the critical systems business depends on, like servers and communications equipment. Through its Liebert AC power, precision cooling and monitoring products and services, Emerson Network Power delivers *Efficiency Without Compromise*, helping customers optimize their data center infrastructures to reduce costs and deliver high availability.
Page 3: High Density Cooling


Emerson Network Power –An organization with established customers

Presenter
Presentation Notes
Through their solutions and their service organization, Emerson Network Power helps every company in the Fortune 500 keep its systems running. Here is just a sampling of some of the recognizable organizations that rely on Emerson Network Power to keep their business in business. Emerson Network Power has customers in virtually every industry, including telecommunications, computing, healthcare, transportation, manufacturing, web-hosting, and banking and finance.
Page 4: High Density Cooling


• Emerson Network Power overview

• “High Density Equals Lower Cost: High Density Design Strategies for Improving Efficiency and Performance,” Steve Madara, Vice President and General Manager, Liebert North America Precision Cooling, Emerson Network Power

• “Sandia National Laboratories’ Energy Efficient Red Sky Design,” David Martinez, Facilities Coordinator, Computing Infrastructure and Support Operations, Sandia National Laboratories

• Question and Answer session

Presentation topics

Presenter
Presentation Notes
Now a little bit about how our webcast will run. In just a minute, I’ll introduce Steve Madara of Emerson Network Power, who will discuss cost-saving high density design strategies; Steve will be followed by David Martinez of Sandia National Laboratories, who will share how Sandia reduced the resource consumption and carbon footprint of their data center by putting these strategies into action. You can ask questions through the console throughout the webcast, and we’ll select several to be answered during our closing Q&A. To ask a question, click the questions tab on the bottom of your screen. Type your question in the box along with your company name and submit it.
Page 5: High Density Cooling

High Density Equals Lower Cost: High Density Design Strategies for Improving Efficiency and Performance

Steve Madara
Vice President and General Manager
Liebert North America Precision Cooling
Emerson Network Power


Presenter
Presentation Notes
Thom: First up today is Steve Madara, vice president and general manager for Emerson Network Power’s Liebert Precision Cooling business in North America. Steve has more than 30 years of experience in the air conditioning industry, and it was under Steve’s leadership that Emerson Network Power pioneered high-density data center cooling strategies with the introduction of the Liebert XD family of products. This strategy paved the way for rapid adoption of blade servers and other technologies that are changing not only how data centers perform, but also how they’re designed and managed. Welcome, Steve. Steve: Thanks for having me, Thom. (ad lib) Thom: Steve, when we first started talking about high density strategies a few years ago, it was mostly to help our viewers resolve issues related to hot spots in the data center, but it seems like today, high density strategies are increasingly becoming an overall design philosophy rather than a solution for hot spots. Steve: Thanks, Thom… (begin presentation) … to support lower costs.
Page 6: High Density Cooling

Agenda

• Industry trends and challenges
• Three strategies for enhancing cooling efficiency
• High density equals lower cost

Presenter
Presentation Notes
…can really equal lower cost.
Page 7: High Density Cooling


Industry Trends and Challenges

Presenter
Presentation Notes
…trends happening in the industry,
Page 8: High Density Cooling

Trends and challenges

[Image: traditional cooling unit]

• Server design
  – Higher ΔT across the server, raising leaving-air temperatures

Presenter
Presentation Notes
…get into that later on in the presentation
Page 9: High Density Cooling

Trends and challenges

• Regulatory
  – ASHRAE 90.1 Standard – 2010 revisions
  – Refrigerant legislation
• Discussion around lower dew point limits
• Raising cold aisle temperatures
  – Intended to improve cooling capacity and efficiency
  – Monitor the impact on server fans
• No cooling

Presenter
Presentation Notes
….free cooling, and we’ll see the benefits of that.
Page 10: High Density Cooling

Spring 2010 Data Center Users’ Group survey

[Chart: Average kW per rack – now: ~8 kW average; in two years: 10–12 kW average]
[Chart: Maximum kW per rack – now: >12 kW average; in two years: 14–16 kW average]
(Response bands: 2 kW or less, >2–4 kW, >4–8 kW, >8–12 kW, >12–16 kW, >16–20 kW, >20–24 kW, greater than 24 kW, unsure)

Presenter
Presentation Notes
….and the trend is gonna continue.
Page 11: High Density Cooling

Spring 2010 Data Center Users’ Group survey: Top data center issues

[Chart, respondents reporting each issue:]
• Experienced hot spots – 40%
• Run out of power – 26%
• Experienced an outage – 23%
• Experienced a “water event” – 23%
• N/A – Have not had any issues – 23%
• Run out of cooling – 18%
• Run out of floor space – 16%
• Excessive energy costs – 13%
• Other – 2%

Presenter
Presentation Notes
…and energy is becoming more and more important.
Page 12: High Density Cooling

Spring 2010 Data Center Users’ Group survey: Implemented or considered technologies

[Chart: adoption status (already implemented / plan to implement / still considering / considered but decided against / will not consider / unsure) for nine technologies: solar array, DC power, fluid economizer on chiller plant, fluid economizer using dry coolers, air economizer, cold aisle containment, containerized/modular data center, wireless monitoring, and rack-based cooling]

Presenter
Presentation Notes
…and looking at rack-based cooling.
Page 13: High Density Cooling

Spring 2010 Data Center Users’ Group survey: Implemented or considered technologies (continued)

[Chart: repeats the adoption-status data from the previous slide]

Presenter
Presentation Notes
…some areas where it’s very, very effective.
Page 14: High Density Cooling

Impact of design trends on the cooling system

• Legacy data center designs with poor airflow management resulted in significant by-pass air
  – A typical CRAC unit has a design ΔT of around 21 °F at 72 °F return air
  – Low ΔT servers and significant by-pass air resulted in a lower actual CRAC ΔT and thus lower efficiency

[Diagram: CRAC at ΔT 16 °F and 100% CFM serving a 4 kW server at ΔT 18 °F and 70% CFM; 72 °F return air, 78 °F server exhaust, 60 °F server inlet, 58 °F supply; by-pass air 30% at 60 °F; design ΔT 21 °F, operating ΔT 16 °F]
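The by-pass penalty on this slide can be checked with a flow-weighted mixing calculation. Here is a minimal sketch in Python using the slide’s figures (the variable names are mine):

```python
# Return air reaching the CRAC is a flow-weighted mix of server exhaust
# and by-pass supply air. Temperatures in °F, flows as fractions of CRAC airflow.
server_exhaust = 78.0   # air leaving the 4 kW server
bypass_supply = 60.0    # supply air short-circuiting past the racks
server_flow = 0.70      # 70% of CRAC airflow actually passes through servers
bypass_flow = 0.30      # 30% by-pass air

return_air = server_flow * server_exhaust + bypass_flow * bypass_supply
print(round(return_air, 1))  # 72.6, close to the 72 °F return shown
```

Without the by-pass, the CRAC would see the full 78 °F server exhaust; the mixing pulls the return temperature down, and with it the operating ΔT (16 °F against the 21 °F design).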

Presenter
Presentation Notes
…eliminate that waste, or that bypass air.
Page 15: High Density Cooling

Impact of higher temperature rise servers

• Even with improved best practices, newer servers can create challenges in existing data centers
  – A server ΔT greater than the CRAC unit’s ΔT capability requires more airflow to meet the kW server load
  – Requires even more careful management of the return air to the cooling unit as a result of the high exiting temperatures at the server

[Diagram: CRAC at ΔT 21 °F and 100% CFM serving an 8 kW server at ΔT 35 °F and 60% CFM; 75 °F return air, 91 °F server exhaust, 56 °F server inlet, 54 °F supply; by-pass air 40% at 56 °F; 75 °F hot aisle]

Presenter
Presentation Notes
…more and more efficient with the cooling system.
Page 16: High Density Cooling

Introducing Efficiency Without Compromise

Expertise & Support
• Improving performance of the IT infrastructure and environment
• Balancing high levels of availability and efficiency
• Adapting to IT changes for continuous optimization and design flexibility
• Delivering architectures from 10–60 kW/rack to minimize space and cost

Presenter
Presentation Notes
…whether it’s the power or the cooling systems.
Page 17: High Density Cooling


Three Strategies for Enhancing Cooling Efficiency

Presenter
Presentation Notes
…strategies for enhancing cooling efficiency within the data center.
Page 18: High Density Cooling

Strategy 1: Getting the highest temperature to the cooling unit

• Higher return air temperatures increase the cooling unit capacity and efficiency
• Increases the CRAC unit ΔT
• Increases the SHR (sensible heat ratio)

Presenter
Presentation Notes
…more efficient as we go forward here.
Page 19: High Density Cooling

Raising the temperature at the room level

[Diagram: two room layouts of rack rows with CRAC units, with and without containment]

Without containment, hot air wrap-around will occur and will limit maximum return temperatures
• Impacted by:
  – The length of the rack rows (long rows are difficult to predict)
  – Floor tile placement
  – Rack consistency and load
• Improved with:
  – Ducted returns
  – Local high density modules

With containment, hot air wrap-around is eliminated and server supply temperatures are controlled
• Containment can be at the room or zone level
• Supply air control provides consistency between containments
• Can be used in combination with high density modules or row-based cooling

Presenter
Presentation Notes
…make the CRAC unit much more efficient.
Page 20: High Density Cooling

Strategy 2: Providing the right cooling and airflow

• Efficiency is gained when:
  Server Load (kW) = Cooling Unit Capacity (kW)
  Server Airflow = Cooling Unit Airflow
• Challenges
  – Rising server ΔT results in higher by-pass air to meet the cooling load
  – Requires variable speed fans and independent control of the airflow from the cooling capacity
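The matching rule above follows from the standard sensible-heat relation for air, q (BTU/hr) ≈ 1.08 × CFM × ΔT (°F). A short sketch (the helper function is mine, not from the deck) shows why a higher server ΔT needs proportionally less airflow per kW:

```python
# Sensible heat carried by an air stream (imperial): q [BTU/hr] ≈ 1.08 * CFM * dT [°F]
BTU_PER_KW_HR = 3412.14  # 1 kW ≈ 3412.14 BTU/hr

def required_cfm(load_kw: float, delta_t_f: float) -> float:
    """Airflow needed to carry load_kw at a given air-side temperature rise."""
    return load_kw * BTU_PER_KW_HR / (1.08 * delta_t_f)

# The deck's two server examples: 4 kW at an 18 °F rise vs. 8 kW at a 35 °F rise
low_dt = required_cfm(4, 18)    # ≈ 702 CFM
high_dt = required_cfm(8, 35)   # ≈ 722 CFM -- twice the load, barely more air
```

This is why matching cooling-unit airflow to server airflow, rather than overblowing, saves fan power as server ΔT rises.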

Presenter
Presentation Notes
…depending on the operation of the server.
Page 21: High Density Cooling

Strategy 3: Provide the most efficient heat rejection components

• Reduce cooling unit fan power
• Maximize use of economization mode to reduce compressor hours of operation (chiller/compressor)
• Improve condenser heat transfer and reduce power
• Direct cooling – eliminate fans and compressor hours

Presenter
Presentation Notes
…in total, and we’ll talk about that towards the end here.
Page 22: High Density Cooling


High Density Equals Lower Cost

Presenter
Presentation Notes
…how does high density equal lower cost?
Page 23: High Density Cooling

High density equals lower cost

Adding 2,000 kW of IT at 20 kW/rack vs. 5 kW/rack:
• Smaller building for high density
• Fewer racks for high density
• Capital equipment costs more
• Equipment installation costs are higher
• High density cooling is more efficient

Cost difference, low density vs. high density (parentheses denote savings for high density):

  Building capital costs @ $250/sq. ft.          ($1,875,000)
  Rack and PDU capital costs @ $2,500 each       ($750,000)
  Cooling equipment capital costs                $320,000
  Installation costs                             $750,000
  Capital cost total                             ($1,555,000)
  Cooling operating costs (1 yr)                 ($420,480)
  Total net savings of a high density design     ($1,975,480)
  5 yr total net savings of high density         ($3,657,400)
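The line items reconcile exactly; a quick check in Python (negative numbers are savings for the high density design):

```python
# Cost differences, high density vs. low density (negative = high density saves)
building   = -1_875_000   # smaller building @ $250/sq. ft.
racks_pdu  =   -750_000   # fewer racks and PDUs @ $2,500 each
cooling    =    320_000   # high density cooling equipment costs more
install    =    750_000   # and costs more to install

capital = building + racks_pdu + cooling + install
annual_cooling_opex = -420_480   # high density cooling is more efficient

first_year = capital + annual_cooling_opex
five_year  = capital + 5 * annual_cooling_opex

print(capital, first_year, five_year)  # -1555000 -1975480 -3657400
```

The five-year figure simply extends the cooling operating savings over five years on top of the capital savings.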

Presenter
Presentation Notes
…provisioning for higher density really drive that lower cost.
Page 24: High Density Cooling

Total cost of ownership benefits

Traditional cooling unit: fan power of 8.5 kW per 100 kW of cooling; average entering air temperature of 72–80 °F
Liebert XD: fan power of 2 kW per 100 kW of cooling; average entering air temperature of 95–100 °F

[Chart: kW of chiller capacity per kW of sensible heat load (latent, fan, and sensible components) for a traditional CW CRAC, a CW enclosed rack, and refrigerant modules]

• 65% less fan power
• Greater cooling coil effectiveness
• 100% sensible cooling
• 20% less chiller capacity required
• Overall 30% to 45% energy savings, yielding a payback down to 1 year

Presenter
Presentation Notes
…down to about one year.
Page 25: High Density Cooling

Solutions transforming high density to high efficiency

Base infrastructure (160 kW): XDC or XDP pumping units; future pumping units of larger capacity with N+1 capability
• Standard cooling modules, 10–35 kW: XDV10, XDO20, XDH20/32
• Rear door, 10–40 kW: XDR10/40
• Component cooling without server fans, 20–40 kW: XDS20
• Embedded cooling in supercomputers (microchannel intercoolers), 35–75 kW, capable of >100 kW

New & future product configurations

Presenter
Presentation Notes
…without compromising the performance of the cooling facility.
Page 26: High Density Cooling

Rear door cooling

• Passive rear door cooling module
  – No cooling unit fans – server airflow only
  – Optimal containment system
  – Allows data centers to be designed in a “cascading” mode
• Simplifies the cooling requirements
  – Solves the problem for customers without hot and cold aisles
  – Room neutral – does not require airflow analysis
  – No electrical connections

Presenter
Presentation Notes
…so it’s really simple to deploy.
Page 27: High Density Cooling

Customer layout

[Diagram: customer rack layout illustrating the cascade effect]

Presenter
Presentation Notes
…very simple as a cooling solution.
Page 28: High Density Cooling

Direct server cooling without server fans

Liebert XDS configuration
• A cooling rack which uses the Clustered Systems cooling technology to move heat directly from the server to the Liebert XD pumped refrigerant system
• Heat is never transferred to the air
• Provides cooling for 36 1U servers
• Available initially in 20 kW capacity, with a 40 kW rack in the future

Benefits
• No cooling module fans or server fans
• 8% to 10% smaller IT power requirements
• Chill Off 2, tested at 80 °F fluid temperature, resulted in an effective PUE < 1
• Next opportunity: XDP connection to a cooling tower without a chiller

[Diagram: cold plate in contact with CPU 0 and CPU 1]

Presenter
Presentation Notes
…with higher fluid temperatures.
Page 29: High Density Cooling


Energy consumption comparisons

Equivalent PUE<1

Presenter
Presentation Notes
… very, very reliable cooling all the time.
Page 30: High Density Cooling


Industry evolving technology to improve data center cooling efficiency

Presenter
Presentation Notes
…Turn it back to you Thom.
Page 31: High Density Cooling

Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.

Sandia National Laboratories’ Energy Efficient Red Sky Design

David J. Martinez
Facilities Coordinator
Corporate Computing Facilities
Sandia National Laboratories

Presenter
Presentation Notes
Thanks, Steve. Now I’d like to hand things over to David Martinez, Facilities Coordinator in the Sandia National Laboratories Corporate Computing Facilities. During his more than 25 years with Sandia Labs, David has worked in a variety of capacities, and he has seen data center operations grow from about 20,000 sq. ft. to over 77,000 sq. ft., comprising four unique data center environments. David, thanks for joining us today to discuss how you implemented some of the high density strategies Steve discussed with the deployment of Sandia’s new “Red Sky” supercomputer. …I’ll just hop right into it here.
Page 32: High Density Cooling


What happened to consolidation?

• Addresses critical national need for High Performance Computing (HPC)

• Replaces aging current HPC system

Why build Red Sky?

Presenter
Presentation Notes
…so we went in search of a new computer.
Page 33: High Density Cooling

System comparison

                     THUNDERBIRD               RED SKY
Racks                140 (large)               36 (small)
Compute              50 teraflops total        10 teraflops per rack
Power density        ~13 kW/rack, full load    ~32 kW/rack, full load
Cooling              518 tons                  328 tons
Water                12.7M gal per yr          7.3M gal per yr

Presenter
Presentation Notes
…computer and more efficient computer.
Page 34: High Density Cooling

Red Sky and new technologies

System design implemented three new technologies for power and cooling:
• Modular power distribution units
• Liebert XDP refrigerant cooling units
• Glacier Doors

The facility had to be prepared…

Presenter
Presentation Notes
....so this is how we did it:
Page 35: High Density Cooling

• 3.5 months
• Zero accidents
• 0.5 miles of copper
• 650 brazed connections
• 400 ft. of carbon steel
• 140 welded connections

Facility preparation

Presenter
Presentation Notes
….and about 140 welded connections.
Page 36: High Density Cooling


Facility preparation

Presenter
Presentation Notes
….running at 40 hertz, 40-50 hertz.
Page 37: High Density Cooling


Facility preparation

Presenter
Presentation Notes
…about 10% gas to get that liquid back to the XDP units.
Page 38: High Density Cooling


Facility preparation

Presenter
Presentation Notes
…a very clean installation.
Page 39: High Density Cooling


Page 40: High Density Cooling

New cooling solutions

XDP Refrigerant Units: a pumping unit that serves as an isolating interface between the building chilled water system and the pumped refrigerant (R-134a) circuit
• Operates above dew point
• No compressor
• Power is used to cool the computer, not dehumidify
• 100% sensible cooling at 0.13 kW per kW of cooling

Glacier Doors: the first rack-mounted, refrigerant-based passive cooling system on the market
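As a rough illustration of the 0.13 kW-per-kW figure (my extrapolation, not a number from the presentation), applied to Red Sky’s nominal rack load from the earlier system comparison:

```python
# Hypothetical application of the XDP ratio: 0.13 kW of cooling-unit power
# per kW of heat removed, applied to Red Sky's nominal full rack load.
racks = 36
kw_per_rack = 32                      # ~32 kW/rack at full load (system comparison slide)
it_load_kw = racks * kw_per_rack      # 1152 kW of IT heat
xdp_power_kw = 0.13 * it_load_kw      # ≈ 150 kW to move that heat, no compressor
```

This assumes every kW of IT load is removed through the XDP circuit at the quoted ratio, which overstates it slightly since some heat is handled by the remaining CRACs.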

Presenter
Presentation Notes
…safe thing-- very, very effective.
Page 41: High Density Cooling

The total cooling solution

How it works:
• 90% of the heat load is removed with the Liebert XDP–Glacier Door combination
• Uses a laminar air flow concept
• Perforated tiles are only needed in the first row

Presenter
Presentation Notes
…and it passes from there.
Page 42: High Density Cooling


Laminar air flow

Presenter
Presentation Notes
…time of trouble, if you did have it.
Page 43: High Density Cooling

How it all comes together

[Diagram: chiller plant supplying 45° fluid to the Liebert XDP, which feeds the Glacier Doors]

Presenter
Presentation Notes
…basically how it works.
Page 44: High Density Cooling

How it all comes together

[Diagram: chiller plant and plate frame heat exchanger supplying the Liebert XDP and plug fan CRAC units; 53° April to Sept., 61° Oct. to March, 52° supply]

Chiller: 0.46 kW of power per ton of cooling
Plate frame heat exchanger: 0.2 kW of power per ton of cooling
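The two kW-per-ton figures imply the economizer-mode saving directly; a minimal check in Python, assuming the plant runs at the quoted ratios at Red Sky’s 328-ton load:

```python
# Plant power at the slide's efficiency figures (kW per ton of cooling),
# assuming constant full load -- an illustration, not a measured result.
tons = 328                        # Red Sky cooling load
chiller_kw = 0.46 * tons          # mechanical chiller: ≈ 151 kW
hx_kw = 0.20 * tons               # plate frame heat exchanger: ≈ 66 kW
saving = 1 - hx_kw / chiller_kw   # ≈ 57% less power when the heat exchanger carries the load
```

Actual savings depend on how many hours per year the climate allows the plate frame heat exchanger to carry the load instead of the chiller.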

Presenter
Presentation Notes
….with our climate here.
Page 45: High Density Cooling


Comparison of compute power and footprint

Presenter
Presentation Notes
…have advanced in this slide.
Page 46: High Density Cooling


Tons of cooling used

Presenter
Presentation Notes
….Red Sky in the red.
Page 47: High Density Cooling


Annual water consumption

Presenter
Presentation Notes
…just by the low tonnage usage.
Page 48: High Density Cooling

Carbon footprint

                  Red Sky   Thunderbird
CO2E (tonnes)     203       912
Footprint (gha)   46        205

Presenter
Presentation Notes
…this is the slide they produced.
Page 49: High Density Cooling

Chiller plant power consumption and cost

                            Thunderbird (518 tons of cooling)   Red Sky (328 tons of cooling)
Kilowatt hours per year     15,954,483                           10,102,452
kWh cost per year           $1,324,222                           $838,504

37% reduction

Presenter
Presentation Notes
…huge step in the right direction with this install.
Page 50: High Density Cooling

Energy consumption

                            Thunderbird (21 CRACs)   Red Sky (12 XDPs & 3 CRACs)
Kilowatt hours per year     1,527,604                340,437
kWh cost per year           $126,791                 $28,256

77% reduction
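The percentage reductions quoted on these last two slides can be reproduced from the reported kilowatt-hour figures:

```python
# Verify the quoted reductions from the reported annual kWh figures.
def pct_reduction(before_kwh: float, after_kwh: float) -> float:
    return 100 * (before_kwh - after_kwh) / before_kwh

# Chiller plant: Thunderbird (518 tons) vs. Red Sky (328 tons)
chiller = pct_reduction(15_954_483, 10_102_452)   # ≈ 36.7% -> "37% reduction"
# Room cooling units: 21 CRACs vs. 12 XDPs + 3 CRACs
units = pct_reduction(1_527_604, 340_437)         # ≈ 77.7% -> "77% reduction"
```

Both quoted figures round cleanly from the underlying data, which supports the consistency of the two slides.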

Presenter
Presentation Notes
…so back to you Thom.
Page 51: High Density Cooling


Q & A

Steve Madara, Vice President and General Manager, Liebert North America Precision Cooling, Emerson Network Power

David J. Martinez, Facilities Coordinator, Corporate Computing Facilities, Sandia National Laboratories

Thank you for joining us!
• Look for more webcasts coming this fall!
• Follow @EmrsnNPDataCntr on Twitter

Presenter
Presentation Notes
Thom: Thanks, David. At this time, Steve and David will be happy to take your questions. We can still accept your questions, which you can submit by clicking on the questions tab and typing your question in the box. I’d also like to remind everyone that the first 50 people to register and fill out the brief survey at the end of our webcast will receive a two-in-one laser pointer integrated with a 1-gigabyte USB drive, so be sure to look for the survey in a few minutes. Our first question comes from … BEGIN LIVE Q&A …. We’ll need to wrap up the questions now, but Steve and David will be answering the questions we didn’t have time for via e-mail. As we close today, I’m pleased to announce that our webcast series is being extended into the fall! Keep your eyes out for an invitation to our next webcast, where we’ll be debuting the results of a comprehensive study conducted by Emerson Network Power that examines the most frequent causes of data center downtime and the *true costs incurred* when data center operations go down. One final note: the survey will be presented as a “pop up.” To make sure your pop-up blocker is off for this, you’ll want to press and hold the “control” key. Go ahead and press that now. On behalf of Emerson Network Power, I’d like to thank you all for your time and attention today. Sign Off