
Innovative Best Practice Data Centre Cooling

Dr. Peter Koch - Sr. VP Engineering and Product Management, Knürr Racks & Solutions, Emerson Network Power Systems EMEA

How do you realize that you are in an energy-efficient data center?


No Datacenter Is Like Another


Key Requirements Are The Same

Availability - Capacity - Efficiency - TCO - Agility


Cooling


Cooling Needs of IT Equipment


The airflow needed at the equipment intake depends on the temperature rise across the server (25°C intake in all cases):

- ΔT = 7 K (exhaust 32°C): 470 m³/h per kW
- ΔT = 15 K (exhaust 40°C): 220 m³/h per kW
- ΔT = 25 K (exhaust 50°C): 130 m³/h per kW

Temperature and humidity at equipment intake: ASHRAE TC 9.9 Thermal Guidelines 2008 / modified 2011 for Class A1*)
Recommended: 18 - 27°C, 5.5°C DP to 60% RH and 15°C DP
Allowable: 15 - 32°C, 20% - 80% RH (Class A1); 10 - 35°C, 20% - 80% RH (Class A2)
*) Source: 2011 Thermal Guidelines for Data Processing Environments - Expanded Data Center Classes and Usage Guidance (ASHRAE Whitepaper)
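These figures follow from the sensible heat equation Q = P / (ρ · cp · ΔT). A minimal Python sketch (assuming standard air properties, ρ ≈ 1.2 kg/m³ and cp ≈ 1005 J/(kg·K); these constants are not from the slide) reproduces them approximately:

# Airflow required to remove 1 kW of IT heat at a given air temperature rise.
RHO = 1.2    # air density, kg/m^3 (assumed)
CP = 1005.0  # specific heat of air, J/(kg*K) (assumed)

def airflow_m3h_per_kw(delta_t_k):
    """Volume flow in m^3/h per kW of heat at temperature rise delta_t_k."""
    return 1000.0 / (RHO * CP * delta_t_k) * 3600.0  # Q = P / (rho * cp * dT)

for dt in (7, 15, 25):
    print(f"dT = {dt:2d} K -> {airflow_m3h_per_kw(dt):4.0f} m^3/h per kW")

This yields about 430 / 200 / 120 m³/h per kW; the slide's 470 / 220 / 130 figures evidently include a margin or slightly different air properties.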


Server Power Draw vs. Intake Temperature


[Chart: server power draw ratio vs. air intake temperature. The ratio rises from 1.00 at 15 - 20°C intake to about 1.12 (ca. +12%) at 35°C.]


Source: 2011 Thermal Guidelines for Data Processing Environments - Expanded Data Center Classes and Usage Guidance (ASHRAE Whitepaper)


Standard Room Cooling System

[Diagram: the cooling plant supplies the IT room with 25°C cold air; 35 - 50°C warm return air flows back to the cooling plant.]


Cooling Alternatives

- chilled water (CW) for larger sites
- compressed refrigerant (DX) for smaller sites
- air?

[Diagram: 25°C supply, 35 - 50°C return; heat to be removed = P_air movement + P_IT equipment.]


Outside Air: Too Hot - Too Cold


(+ too humid - too dry, too dusty, polluted)
[Chart: annual ambient temperature range plotted against the server inlet temperature envelopes of ASHRAE TC 9.9*: recommended 18 - 27°C; allowable 15 - 32°C (Class A1) and 10 - 35°C (Class A2).]

*) Source: 2011 Thermal Guidelines for Data Processing Environments - Expanded Data Center Classes and Usage Guidance (ASHRAE Whitepaper)


Direct Free Cooling = Fresh Air Cooling?


- Temperature not appropriate most of the time:
  - heating / hot air recirculation needed in winter
  - additional cooling needed in summer
- Humidification / dehumidification required (energy consumption!)
- Filtering required (energy consumption, maintenance)
- Gaseous pollutants cannot be filtered: corrosion risk!
- Large air ducts required, strong restrictions on building design
- Difficult to protect against intruders / vandalism / terrorism
- Difficult to protect against fire


Indirect Free Cooling: Water-Side Economizer


[Diagram: ambient air (max. 15°C dry bulb / wet bulb) pre-cools a chilled water intermediate circuit (e.g. 20°C / 25°C) that serves the IT room.]

- Inside air is kept separate from outside air.
- Heat transport by chilled water enables flexible building design and a high level of protection.
- About 10 K are "lost" across the heat exchangers.
- Refrigeration is required when the outside temperature is too high.

Air-to-Air Heat Exchangers, Heat Wheels and the Like


- Indirect free cooling, but they don't do any better!
- Take valuable space.
- Complicate building design.

Example: where a chilled-water design yields 4 x 470 m² of IT space on 2 floors, air-to-air heat exchangers leave only 4 x 300 m² on the same 2 floors.


Capacity


A Brief History of Cooling Pain Points


2002: Heat Density - Closed Water-Cooled Racks

From 2 to 30 kW per rack, and the heat goes on!



Cooling Solutions - Capacity Ranges


Approach                             kW per rack   Efficiency
Room, w/o containment                < 5           -
Room, with containment               around 10     +
Row                                  around 15     +
Rack, closed                         around 25     +
Rack, open                           around 30     ++
Server, chip cooling / cold plate    up to 75      +++


Power Density Matters

4 kW per rack: 80% empty!


Efficiency - Availability


A Brief History of Cooling Pain Points


2007: Energy Efficiency - Aisle Containments

[Pie charts: approximate facility power breakdown.
PUE = 2.0: IT load 50%, cooling 25%, fans 12%, electrical losses 11%, light 3%.
PUE = 1.25: IT load 80%, cooling 8%, fans 7%, electrical losses 2%, light 2%.]

From 2.0 to 1.25, and it can be done even better!



PUE - How to Measure it Correctly


PUE (Power Usage Effectiveness) = Total Facility Power / IT Equipment Power

[Diagram: measure Total Facility Power at the medium voltage transformer / genset if possible, or further downstream; measure IT Equipment Power behind the PDUs, e.g. with metered rack PDUs. UPS, PDU and CRAC losses all belong to facility power.]

- Make sure to capture all electrical losses.
- Fans and pumps (cooling) may be on UPS.
- Rack fans don't count as IT load.
- Use metered rack PDUs.
- Average over one full year.
- Analogous: WUE for water consumption.
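A minimal sketch of the full-year calculation (hypothetical meter readings, not a specific metering API): annual PUE is the ratio of energy totals, not an average of instantaneous ratios.

# Annual PUE from paired energy readings (kWh per metering interval).
# The series below are hypothetical example values, e.g. monthly totals.
facility_kwh = [900.0, 1000.0, 400.0]  # total facility energy per interval
it_kwh       = [700.0,  800.0, 200.0]  # IT equipment energy per interval

# Correct: divide annual energy totals (time-weighted average).
pue = sum(facility_kwh) / sum(it_kwh)

# Common mistake: averaging interval-wise ratios overweights
# lightly loaded intervals, where fixed overheads dominate.
pue_naive = sum(f / i for f, i in zip(facility_kwh, it_kwh)) / len(it_kwh)

print(f"annual PUE: {pue:.2f} (naive average of ratios: {pue_naive:.2f})")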

Power is Money
1 MW for 1 year:
8.76 million kWh, i.e. about 1 million Euro @ 0.12 Euro/kWh
about 5,000 t CO2 (German power mix)
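The arithmetic, as a quick check (the CO2 factor of roughly 0.57 kg/kWh is the one implied by the slide's figures for the German power mix):

# 1 MW drawn continuously for one year.
energy_kwh = 1000 * 8760            # 1000 kW * 8760 h = 8.76 million kWh
cost_eur = energy_kwh * 0.12        # ~1.05 million Euro at 0.12 Euro/kWh
co2_t = energy_kwh * 0.57 / 1000    # ~5,000 t CO2 at ~0.57 kg CO2/kWh
print(f"{energy_kwh/1e6:.2f} million kWh, {cost_eur/1e6:.2f} million Euro, {co2_t:,.0f} t CO2")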


Room Cooling: Cold Aisle Containment


Control Principle

[Diagram: cold aisle containment with differential pressure control, Δp ≈ 0. Fan speed is regulated so that the pressure difference between the contained cold aisle and the room stays near zero, i.e. the cooling units supply exactly the airflow the IT equipment draws.]
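A toy sketch of such a control loop (hypothetical plant model and illustrative PI gains; not the actual product controller):

import random

KP, KI = 0.01, 0.001     # illustrative controller gains
server_demand = 0.55     # fraction of max airflow the servers draw (assumed)

def delta_p(fan_speed):
    """Crude plant model: pressure difference grows with oversupply of cold air."""
    return 40.0 * (fan_speed - server_demand) + random.uniform(-0.3, 0.3)

speed, integral = 0.9, 0.0
for _ in range(100):
    error = delta_p(speed)               # setpoint is dp = 0 Pa
    integral += error
    speed -= KP * error + KI * integral  # oversupply -> slow the fans down
    speed = min(1.0, max(0.2, speed))    # keep a safe minimum airflow

print(f"fan speed settles near server demand: {speed:.2f}")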


Huge Savings From Fan Speed Control


[Chart: power draw in kW of an HPM L15EC unit (EC fans) vs. fan speed / airflow at 33%, 50%, 75% and 100%; the curve falls steeply towards a "sweet spot" at partial speed.]

Fan speed control bears enormous savings: 1/2 speed means about 1/8 power.
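The "1/2 speed = 1/8 power" rule is the fan affinity law, power ∝ speed³. A quick check under that assumption (the 8 kW rated power is illustrative, not a datasheet value):

def fan_power_kw(rated_kw, speed_fraction):
    """Electrical fan power at partial speed, assuming the cubic affinity law."""
    return rated_kw * speed_fraction ** 3

rated = 8.0  # illustrative rated fan power, kW
for pct in (100, 75, 50, 33):
    p = fan_power_kw(rated, pct / 100)
    print(f"{pct:3d}% speed -> {p:5.2f} kW ({p / rated:6.1%} of rated power)")
# 50% speed -> 12.5% of rated power, i.e. "1/2 speed = 1/8 power".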


Containment Benefits
Capacity
- more power / heat density per rack
- more cooling capacity per cooling unit
- suitable for greenfield and retrofit installations
Efficiency
- optimized floor space utilization
- utmost energy efficiency by system control "from rack to roof"
- easy upgrade of existing sites
Availability
- no hot spots
- resilient response to cooling system failures
- predictable system behavior
- controllable capacity limits
- easy to monitor / manage / operate


Containment Benefits
TCO
- huge energy cost and floor space savings
- upgrade amortization typically between 6 and 18 months (see the sketch below)
- simplified design process
- easy deployment and installation based on standard components from industrial production
Agility
- easy deployment of IT components
- dynamic system control
- early identification of capacity limits
- modular expandability
- customizable containment designs for virtually any situation
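A payback sketch for the amortization claim, with entirely hypothetical numbers (retrofit cost, IT load and the PUE improvement are assumptions, not figures from the slides; only the tariff matches the "Power is Money" slide):

# Containment retrofit payback: energy saved by lowering PUE at constant IT load.
it_load_kw = 500                    # assumed IT load
pue_before, pue_after = 2.0, 1.6    # assumed improvement from containment
tariff = 0.12                       # Euro/kWh, as on the "Power is Money" slide
retrofit_cost = 100_000             # Euro, assumed

saved_kwh_per_year = it_load_kw * (pue_before - pue_after) * 8760
savings_per_year = saved_kwh_per_year * tariff
payback_months = retrofit_cost / savings_per_year * 12
print(f"payback: {payback_months:.0f} months")  # ~6 months with these numbers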


Alternative Setups


The Mothers of Efficiency


- Consistent cold / hot air separation
- Dynamic system control "from rack to roof"
- High temperature level, free cooling
- Intelligent utilization of redundancies
- Modularity


The Classical Approach - Still Going Strong!


- Highly efficient components (EC fans, digital scroll compressor, large coils ...)
- Aisle containment
- Fan speed control
- High temperature level
- Extensive utilization of indirect free cooling
- Optimized system, dynamic system control "from rack to roof"
- ASHRAE recommended envelope all year round
- PUE better than 1.25 achievable

[Diagram: chilled water system with free cooling over a -20 ... +40°C ambient range; 20°C / 25°C chilled water; 25°C supply air, 35 - 50°C return air.]


Modularity
- Racks, rows
- Aisles
- Rooms (300 - 500 sqm): in-room modularity
- Containers, pods
- Buildings

In-Room Modularity


A Brief History of Cooling Pain Points


2012: Hot Side Temperature

From 35 to 50°C, and the heat goes on!



Hot Side Temperature Up to 50°C


[Chart: exhaust temperature range of high-end blade servers, from idling (ca. 35°C) up to full load (ca. 50°C), plotted against the ASHRAE TC 9.9* server inlet envelopes: recommended 18 - 27°C; allowable 15 - 32°C (Class A1) and 10 - 35°C (Class A2).]

*) Source: 2011 Thermal Guidelines for Data Processing Environments - Expanded Data Center Classes and Usage Guidance (ASHRAE Whitepaper)


What Can Be Done?


- Excess air supply, cooling of hot zones: costs efficiency!
- Reduced cold air temperature: costs efficiency!
- Gliding operation mode: helps only temporarily
- Enclose the hot air:
  - rack cooling
  - row cooling
  - the hot floor concept


Closed Rack Cooling


- Closed water-cooled server racks
- 12 - 30 kW nominal cooling capacity
- Vertical or horizontal airflow


Open Rack Cooling


- Open water-cooled server rack
- 35 kW cooling capacity
- Passive rear door heat exchanger
- Air movement by server fans
- Currently the most efficient solution

Goethe University Frankfurt / Germany


In-the-Row Cooling


- Water-cooled units, 30 kW nominal capacity
- One or more units per row
- Closed or "hybrid" arrangements
- Solutions from 6 to 60 kW per rack


The Hot Floor Concept


[Schematic: the hot floor concept, with numbered stations 1 - 7.]


How do you realize that you are in an energy-efficient data center?
