NREL is a national laboratory of the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, operated by the Alliance for Sustainable Energy, LLC.
US Trends in Data Centre Design with NREL Examples of Large Energy Savings
Understanding and Minimising the Costs of Data Centre Based IT Services Conference
University of Liverpool
Otto Van Geet, PE
June 17, 2013
2
Cost and Infrastructure Constraints
[Chart: Total Annual Electrical Cost (Compute + Facility), in millions of dollars, plotted against P.U.E. from 1.00 to 1.93, with the cost split between HPC and Facility. Assumes a ~20MW HPC system and $1M per MW-year utility cost.]
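The curve above is easy to reproduce. A minimal Python sketch, assuming only the figures stated on the slide (a 20 MW IT load and $1M per MW-year utility cost):

    # Annual electrical cost of the compute load plus its facility overhead.
    IT_LOAD_MW = 20.0          # assumed HPC (IT) load from the slide
    COST_PER_MW_YEAR = 1.0e6   # assumed utility cost, $ per MW-year

    def annual_cost_musd(pue):
        """Total annual electrical cost (compute + facility) in millions of dollars."""
        return IT_LOAD_MW * pue * COST_PER_MW_YEAR / 1e6   # facility draw scales the IT load by PUE

    for pue in (1.06, 1.2, 1.5, 1.9):
        total = annual_cost_musd(pue)
        compute = annual_cost_musd(1.0)   # the irreducible compute share ($20M here)
        print(f"PUE {pue:.2f}: total ${total:.1f}M (compute ${compute:.1f}M + facility ${total - compute:.1f}M)")

At this scale every 0.1 of PUE is roughly $2M per year, which is why the facility share dominates the rest of the deck.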
  
3
BPG Table of Contents
•  Summary
•  Background
•  Information Technology Systems
•  Environmental Conditions
•  Air Management
•  Cooling Systems
•  Electrical Systems
•  Other Opportunities for Energy Efficient Design
•  Data Center Metrics & Benchmarking
4
Safe Temperature Limits
CPUs ~65C (149F), GPUs ~75C (167F), Memory ~85C (185F)
CPU, GPU & Memory represent ~75-90% of the heat load …
5
Environmental Conditions
Data Center equipment's environmental conditions should fall within the ranges established by ASHRAE as published in the Thermal Guidelines book.
ASHRAE Reference: ASHRAE (2008), (2011)

Environmental Specifications (°C)
(@ Equipment Intake)                    Recommended                       Allowable
Temperature (Data Centers, ASHRAE)      18° – 27°C                        15° – 32°C (A1) to 5° – 45°C (A4)
Humidity (RH) (Data Centers, ASHRAE)    5.5°C DP to 60% RH and 15°C DP    20% – 80% RH
  
6
2011 ASHRAE Allowable Ranges
[Chart: the 2011 ASHRAE allowable ranges plotted against dry bulb temperature.]
7
Psychrometric Bin Analysis
[Psychrometric chart: Boulder, Colorado TMY3 weather data plotted as humidity ratio (lb water/lb dry air) against dry-bulb temperature (ºF), with relative humidity curves and the ASHRAE Class 1 Recommended and Allowable ranges overlaid.]
Design Conditions (0.4%): 91.2 db, 60.6 wb
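The bin analysis behind this chart counts how many weather hours fall into each cooling regime. A heavily simplified Python sketch of the idea; the 80ºF supply limit and the 90% evaporative effectiveness are illustrative assumptions, not the exact boundaries of the seven zones tabulated on the next slide:

    # Toy psychrometric bin analysis: classify each weather hour by the cooling it needs.
    def classify_hour(dry_bulb_f, wet_bulb_f, supply_max_f=80.0, evap_effectiveness=0.9):
        """Return the cooling regime for one hour (illustrative thresholds only)."""
        if dry_bulb_f <= supply_max_f:
            return "outside air only"
        # Direct evaporative cooling approaches the wet-bulb temperature.
        evap_supply_f = dry_bulb_f - evap_effectiveness * (dry_bulb_f - wet_bulb_f)
        if evap_supply_f <= supply_max_f:
            return "evaporative cooling"
        return "mechanical (DX) cooling"

    def bin_hours(weather_hours):
        """weather_hours: iterable of (dry_bulb_f, wet_bulb_f) pairs, e.g. 8,760 TMY3 rows."""
        counts = {}
        for db, wb in weather_hours:
            regime = classify_hour(db, wb)
            counts[regime] = counts.get(regime, 0) + 1
        return counts

    # A real run would read the Boulder TMY3 file; these three hours are made up.
    print(bin_hours([(60, 50), (85, 59), (95, 85)]))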
  
8
Estimated Savings

Baseline
  System: DX cooling with no economizer
  Load: 1 ton of cooling, constant year-round
  Efficiency (COP): 3
  Total Energy (kWh/yr): 10,270

Results                                         RECOMMENDED RANGE        ALLOWABLE RANGE
                                                Hours   Energy (kWh)     Hours   Energy (kWh)
Zone 1: DX Cooling Only                            25        8               2        1
Zone 2: Multistage Indirect Evap. + DX (H80)       26       16               4        3
Zone 3: Multistage Indirect Evap. Only              3        1               0        0
Zone 4: Evap. Cooler Only                         867       97             510       57
Zone 5: Evap. Cooler + Outside Air               6055      417            1656       99
Zone 6: Outside Air Only                          994        0            4079        0
Zone 7: 100% Outside Air                          790        0            2509        0
Total                                           8,760      538           8,760      160
Estimated % Savings                                 -      95%               -      98%
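A quick check of the bottom line, using the 10,270 kWh/yr baseline and the annual totals from the table:

    baseline_kwh = 10_270      # DX cooling with no economizer (COP 3, 1 ton year-round)
    recommended_kwh = 538      # annual energy when held to the recommended envelope
    allowable_kwh = 160        # annual energy when the full allowable envelope is used

    for label, kwh in [("recommended range", recommended_kwh), ("allowable range", allowable_kwh)]:
        print(f"{label}: {1 - kwh / baseline_kwh:.0%} savings")   # ~95% and ~98%, as tabulated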
  
9
Data Center Efficiency Metric
•  Power Usage Effectiveness (P.U.E.) is an industry standard data center efficiency metric.
•  The ratio of power used or lost by data center facility infrastructure (pumps, lights, fans, conversions, UPS…) to power used by compute.
•  Not perfect; some folks play games with it.
•  A 2011 survey estimates the industry average is 1.8.
•  In a typical data center, half of the power goes to things other than compute capability.

P.U.E. = ("IT power" + "Facility power") / "IT power"
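The same definition in code form; the meter readings are hypothetical:

    def pue(it_power_kw, facility_overhead_kw):
        """Power Usage Effectiveness: total site power divided by the power used by IT."""
        return (it_power_kw + facility_overhead_kw) / it_power_kw

    # Hypothetical example: 1,000 kW of IT load with 800 kW of cooling/UPS/lighting losses.
    print(pue(1000, 800))   # 1.8, roughly the 2011 industry-average figure cited above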
10
PUE – Simple and Effective
  
11
Data Center PUE
[Chart: data center PUE (roughly 0.75 to 1.45) and outdoor temperature (-20 to 100 °F) plotted together over time.]
“I am re-using waste heat from my data center on another part of my site and my PUE is 0.8!”

ASHRAE & friends (DOE, EPA, TGG, 7x24, etc.) do not allow reused energy in PUE, and PUE is always >1.0.

Another metric has been developed by The Green Grid and partners: ERE – Energy Reuse Effectiveness.
http://www.thegreengrid.org/en/Global/Content/white-papers/ERE
  
13
ERE – Adds Energy Reuse
[Diagram: energy flow from the Utility through Cooling, UPS and PDU to the IT load, with Rejected Energy and Reused Energy branches (flows labeled a through g).]
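Per the Green Grid white paper linked on the previous slide, ERE subtracts any energy usefully re-used elsewhere from the facility total before dividing by IT energy. A minimal sketch with hypothetical numbers:

    def ere(it_kwh, facility_overhead_kwh, reused_kwh):
        """Energy Reuse Effectiveness = (total energy - reused energy) / IT energy."""
        total_kwh = it_kwh + facility_overhead_kwh
        return (total_kwh - reused_kwh) / it_kwh

    # Hypothetical: PUE of 1.2 with a third of the IT heat re-used for building heating.
    print(ere(it_kwh=1000, facility_overhead_kwh=200, reused_kwh=330))   # ~0.87

Unlike PUE, ERE can legitimately drop below 1.0, which is what the 0.8 claim above was really describing.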
14
DOE/NREL Research Support Facility
Credit: Haselden Construction
•  More than 1300 people in DOE office space on NREL's campus
•  33,445 m2
•  Design/build process with required energy goals
   –  50% energy savings from code
   –  LEED Platinum
•  Replicable
   –  Process
   –  Technologies
   –  Cost
•  Site, source, carbon, cost ZEB:B
   –  Includes plug loads and datacenter
•  Firm fixed price - US $22.8/m2 construction cost (not including $2.5/m2 for PV from PPA/ARRA)
•  Opened June 10, 2010 (First Phase)
  	
  
15
RSF Datacenter
•  Fully containing hot aisle
   –  Custom aisle floor and door seals
   –  Ensure equipment designed for cold aisle containment, and installed to pull cold air (not hot air…)
   –  1.18 annual PUE
   –  ERE = 0.9
•  Control hot aisle based on return temperature of ~90F.
•  Waste heat used to heat building.
•  Outside air and evaporative cooling
•  Low fan energy design
•  176 sq m.
Credit: Marjorie Schott/NREL
  
16
17
Data Center Load GROWTH (40+ kW in 2 years) since there is NO recharge (chargeback) for power!
18
Move to Liquid Cooling
•  Server fans are inefficient and noisy.
   –  Liquid doors are an improvement but we can do better!
•  Power densities are rising, making component-level liquid cooling solutions more appropriate.
•  Liquid benefits
   –  Thermal stability, reduced component failures.
   –  Better waste heat re-use options.
   –  Warm water cooling; reduce/eliminate condensation.
   –  Provide cooling with higher temperature coolant.
•  Eliminate expensive & inefficient chillers.
•  Save wasted fan energy and use it for computing.
•  Unlock your cores and overclock to increase throughput!
  
19
Liquid Cooling – Overview
Water and other liquids (dielectrics, glycols and refrigerants) may be used for heat removal.
•  Liquids typically use LESS transport energy (14.36 air-to-water horsepower ratio for the example below; a rough comparison is also sketched after this list).
•  Liquid-to-liquid heat exchangers have closer approach temps than liquid-to-air (coils), yielding increased outside air hours.
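The transport-energy gap comes straight from the density and heat-capacity difference between air and water. A rough Python sketch of the comparison; the fan and pump pressures, temperature rises and efficiencies below are illustrative assumptions rather than the slide's own 14.36:1 example, but they land in the same ballpark:

    # Rough comparison of the power needed to move a given heat load with air vs. water.
    def transport_power_kw(heat_kw, rho, cp, delta_t_k, delta_p_pa, efficiency):
        """Fan or pump power to carry heat_kw of heat at the given temperature rise."""
        vol_flow_m3s = heat_kw * 1000.0 / (rho * cp * delta_t_k)   # from Q = rho * cp * V * dT
        return vol_flow_m3s * delta_p_pa / efficiency / 1000.0

    heat_kw = 100.0
    fan_kw  = transport_power_kw(heat_kw, rho=1.2, cp=1006, delta_t_k=12, delta_p_pa=500, efficiency=0.6)
    pump_kw = transport_power_kw(heat_kw, rho=998, cp=4186, delta_t_k=16, delta_p_pa=200e3, efficiency=0.7)
    print(f"fan ~{fan_kw:.1f} kW, pump ~{pump_kw:.2f} kW, ratio ~{fan_kw / pump_kw:.0f}:1")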
  
20
2011 ASHRAE Liquid Cooling Guidelines
NREL ESIF HPC (HP hardware) using 24 C supply, 40 C return – W4/W5
  
21
NREL HPC Data Center
Showcase Facility
•  10MW, 929 m2
•  Leverage favorable climate
•  Use direct water-to-rack cooling
•  DC manager responsible for ALL DC cost including energy!
•  Waste heat captured and used to heat labs & offices.
•  World's most energy efficient data center, PUE 1.06!
•  Lower CapEx and OpEx.
Leveraged expertise in energy efficient buildings to focus on showcase data center.
"Chips to bricks" approach

High Performance Computing
•  Operational 1-2013; Petascale+ HPC capability in 8-2013
•  20-year planning horizon
   –  5 to 6 HPC generations.
  
22
Critical Data Center Specs
•  Warm water cooling, 24C
   –  Water is a much better working fluid than air - pumps trump fans.
   –  Utilize high quality waste heat, 40C or warmer.
   –  +90% of IT heat load to liquid.
•  High power distribution
   –  480VAC, eliminate conversions.
•  Think outside the box
   –  Don't be satisfied with an energy efficient data center nestled on a campus surrounded by inefficient laboratory and office buildings.
   –  Innovate, integrate, optimize.
Dashboards report instantaneous, seasonal and cumulative PUE values.
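To get a feel for the plumbing these specs imply, the water flow needed to carry the IT heat across the 24 C supply / 40 C return split is a one-line estimate (a back-of-the-envelope sketch; the 1 MW heat load is an assumed example):

    # Water flow needed to carry a heat load at a 24 C supply / 40 C return split.
    CP_WATER = 4186.0    # J/(kg*K)
    RHO_WATER = 998.0    # kg/m^3

    def water_flow_l_per_s(heat_kw, supply_c=24.0, return_c=40.0):
        kg_per_s = heat_kw * 1000.0 / (CP_WATER * (return_c - supply_c))
        return kg_per_s / RHO_WATER * 1000.0

    print(f"{water_flow_l_per_s(1000):.0f} L/s per MW of IT heat rejected to liquid")   # ~15 L/s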
  
23
NREL ESIF Data Center Cross Section
•  Data center equivalent of the "visible man"
   –  Reveal not just boxes with blinky lights, but the inner workings of the building as well.
   –  Tour views into pump room and mechanical spaces
   –  Color code pipes, LCD monitors
  
24
Data Center
•  2.5 MW - day one capacity (utility $500K/yr/MW)
•  10 MW - ultimate capacity
•  Petaflop
•  No vapor compression for cooling
25
Data Center - Summer Cooling Mode
PUE: typical data center = 1.5 - 2.0; NREL ESIF = 1.04
* 30% more energy efficient than your typical "green" data center
26
Data Center - Winter Cooling Mode
ERE - Energy Reuse Effectiveness: how efficiently are we using the waste heat to heat the rest of the building?
NREL ESIF = 0.7 (we use 30% of the waste heat; more with future campus loops)
[Diagram labels: Future Campus Heating Loop, High Bay Heating Loop, Office Heating Loop, Conference Heating Loop]
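Using the Green Grid relation introduced earlier (ERE = (total energy - reused energy) / IT energy), the numbers on this slide can be sanity-checked against the PUE of about 1.06 cited earlier for this data center; treat this as a rough annualized sketch:

    # What ERE = 0.7 implies, given the ~1.06 PUE reported earlier for the ESIF data center.
    pue, ere = 1.06, 0.7
    reused_per_it = pue - ere                # reused energy as a fraction of IT energy
    reused_per_total = reused_per_it / pue   # ... and as a fraction of total facility energy
    print(f"reused ~{reused_per_it:.0%} of IT energy, ~{reused_per_total:.0%} of total energy")
    # Roughly a third of the energy re-used, consistent with the ~30% figure on this slide.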
27
Data Center - Cooling Strategy
•  Water-to-rack cooling for High Performance Computers handles 90% of the total load.
•  Air cooling for legacy equipment handles 10% of the total load (the diagram shows 95 deg and 75 deg air streams).
28
PUE 1.0X -- Focus on the "1"
True efficiency requires 3-D optimization:
•  Facility PUE: we all know how to do this!
•  IT Power Consumption
•  Energy Re-use
29
PUE 1.0X -- Focus on the "1"
True efficiency requires 3-D optimization:
•  Facility PUE: we all know how to do this!
•  IT Power Consumption: increased work per watt; reduce or eliminate fans; component level heat exchange; newest processors are more efficient.
•  Energy Re-use
  
30
PUE 1.0X -- Focus on the "1"
True efficiency requires 3-D optimization:
•  Facility PUE: we all know how to do this!
•  IT Power Consumption: increased work per watt; reduce or eliminate fans; component level heat exchange; newest processors are more efficient.
•  Energy Re-use: direct liquid cooling, higher return water temps; holistic view of data center planning.
  
31
What's Next?
✓  Energy efficient supporting infrastructure.
✓  Pumps, large pipes, high voltage (380 to 480) electrical to rack.
✓  Efficient HPC for planned workload.
✓  Capture and re-use waste heat.

Can we manage and "optimize" workflows, with varied job mix, within a given energy "budget"?

Can we do this as part of a larger "ecosystem"?

Steve Hammond
32
Other Factors
DemandSMART: Comprehensive Demand Response
Balancing supply and demand on the electricity grid is difficult and expensive. End users that provide a balancing resource are compensated for the service.
[Chart: annual electricity demand as a percent of available capacity (25%-100%) across Winter, Spring, Summer and Fall, with a 90% peak marked.]
DC as part of Campus Energy System: 4MW solar, use waste heat, better rates, shed load.
33
Parting Thoughts
•  Energy Efficient Data Centers - been there, done that
   –  We know how, let's just apply best practices.
   –  Don't fear H2O: liquid cooling will be increasingly prevalent.
•  Metrics will lead us into sustainability
   –  If you don't measure/monitor it, you can't manage it.
   –  As PUE has done, ERE, Carbon Use Effectiveness (CUE), etc. will help drive sustainability.
•  Energy Efficient and Sustainable Computing - it's all about the "1"
   –  1.0 or 0.06? Where do we focus? Compute & energy reuse.
•  Holistic approaches to energy management.
   –  Lots of open research questions.
   –  Projects may get an energy allocation rather than a node-hour allocation (a quick conversion is sketched below).
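The last bullet is easy to make concrete: a node-hour allocation becomes an energy allocation once average node power and the facility overhead are fixed. All numbers in this sketch are hypothetical:

    # Converting a node-hour allocation into a site-energy allocation (hypothetical numbers).
    node_hours = 1_000_000
    watts_per_node = 350.0    # assumed average draw per node
    pue = 1.06                # facility overhead multiplier

    site_energy_mwh = node_hours * watts_per_node * pue / 1e6
    print(f"{site_energy_mwh:,.0f} MWh of site energy for {node_hours:,} node-hours")   # ~371 MWh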
34
Otto VanGeet
303.384.7369
Otto.VanGeet@nrel.gov

NREL RSF
50% of code energy use
Net zero annual energy
$22.8/m2 Construction Cost

QUESTIONS?
  
35
Water Considerations
"We shouldn't use evaporative cooling, water is scarce."
•  Thermoelectric power generation (coal, oil, natural gas and nuclear) consumes about 1.1 gallons per kWh, on average.
•  This amounts to about 9.6 M gallons per MW-year.
•  We estimate about 2.5 M gallons of water consumed per MW-year for on-site evaporative cooling towers at NREL.
•  If chillers need 0.2 MW per MW of HPC power, then chillers have an impact of 2.375 M gallons per year per MW.
•  Actuals will depend on your site, but evap. cooling doesn't necessarily result in a net increase in water use.
•  Low energy use = lower water use. Energy reuse uses NO water!
NREL PIX 00181
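The 9.6 M gallon figure in the second bullet follows directly from the first; the arithmetic, as a short Python check:

    # Upstream water use of grid electricity, per MW-year, using the slide's 1.1 gal/kWh average.
    GAL_PER_KWH_THERMOELECTRIC = 1.1
    HOURS_PER_YEAR = 8760

    offsite_gal = GAL_PER_KWH_THERMOELECTRIC * 1000 * HOURS_PER_YEAR   # one MW for one year
    print(f"offsite: {offsite_gal / 1e6:.1f} M gal per MW-year")       # ~9.6 M gal

Against that baseline, the ~2.5 M gallons per MW-year estimated for NREL's on-site cooling towers is the smaller number, which is the point of the slide.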
36
Data Center Efficiency
•  Choices regarding power, packaging, cooling, and energy recovery in data centers drive TCO.
•  Why should we care?
   •  Carbon footprint.
   •  Water usage.
   •  Mega$ per MW-year.
   •  Cost: OpEx ~ IT CapEx!
•  A less efficient data center takes away power and dollars that could otherwise be used for compute capability.
  
37
Holistic Thinking
•  Approach to Cooling: Air vs Liquid, and where?
   –  Components, liquid doors or CRACs, …
•  What is your "ambient" temperature?
   –  55F, 65F, 75F, 85F, 95F, 105F …
   –  13C, 18C, 24C, 30C, 35C, 40.5C …
•  Electrical distribution:
   –  208V or 480V?
•  "Waste" heat:
   –  How hot? Liquid or air? Throw it away or use it?
  
38
Liquid Cooling – New Considerations
•  Air Cooling
   –  Humidity
   –  Fan failures
   –  Air side economizers, particulates
•  Liquid Cooling
   –  pH & bacteria
   –  Dissolved solids
   –  Corrosion inhibitors, etc.
•  When considering liquid cooled systems, insist that providers adhere to the latest ASHRAE water quality spec or it could be costly.

39
2011 ASHRAE Liquid Cooling Guidelines
  
40
2011 ASHRAE Thermal Guidelines
2011 Thermal Guidelines for Data Processing Environments – Expanded Data Center Classes and Usage Guidance. White paper prepared by ASHRAE Technical Committee TC 9.9.
41
Energy Savings Potential: Economizer Cooling
[Map: energy savings potential for the recommended envelope, Stage 1: Economizer Cooling. (Source: Billy Roberts, NREL)]
  	
  
42
Data Center Energy
•  Data centers are energy intensive facilities.
   –  10-100x more energy intensive than an office.
   –  Server racks well in excess of 30kW.
   –  Power and cooling constraints in existing facilities.
•  Data Center inefficiency steals power that would otherwise support compute capability.
•  Important to have the DC manager responsible for ALL DC cost including energy!
  
	
  
	
  
43
Energy Savings Potential: Economizer + Direct Evaporative Cooling
[Map: energy savings potential for the recommended envelope, Stage 2: Economizer + Direct Evap. Cooling. (Source: Billy Roberts, NREL)]
  	
  
44
Energy Savings Potential: Economizer + Direct Evap. + Multistage Indirect Evap. Cooling
[Map: energy savings potential for the recommended envelope, Stage 3: Economizer + Direct Evap. + Multistage Indirect Evap. Cooling. (Source: Billy Roberts, NREL)]
  	
  
45
Data Center Energy Efficiency
•  ASHRAE 90.1-2011 requires an economizer in most data centers.
•  ASHRAE Standard 90.4P, Energy Standard for Data Centers and Telecommunications Buildings
   •  PURPOSE: To establish the minimum energy efficiency requirements of Data Centers and Telecommunications Buildings, for:
      •  Design, construction, and a plan for operation and maintenance
   •  SCOPE: This Standard applies to:
      •  New, new additions, and modifications to Data Centers and Telecommunications Buildings or portions thereof and their systems
   •  Will set minimum PUE based on climate.
•  More detail at: https://www.ashrae.org/news/2013/ashrae-seeks-input-on-revisions-to-data-centers-in-90-1-energy-standard-scope
  
46
Energy Conservation Measures
1.  Reduce the IT load – virtualization & consolidation (up to 80% reduction).
2.  Implement contained hot aisle and cold aisle layout.
    –  Curtains, equipment configuration, blank panels, cable entrance/exit ports.
3.  Install economizer (air or water) and evaporative cooling (direct or indirect).
4.  Raise discharge air temperature. Install VFDs on all computer room air conditioning (CRAC) fans (if used) and network the controls.
5.  Reuse data center waste heat if possible.
6.  Raise the chilled water (if used) set-point (see the sketch after this list).
    –  Increasing chilled water temperature by 1°C reduces chiller energy use by about 3%.
7.  Install high efficiency equipment including UPS, power supplies, etc.
8.  Move chilled water as close to the server as possible (direct liquid cooling).
9.  Consider a centralized high efficiency water cooled chiller plant.
    –  Air-cooled = 2.9 COP, water-cooled = 7.8 COP.
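For items 6 and 9, the energy impact can be estimated directly from the rules of thumb given above. A sketch; the 1,000 kW cooling load and the 5 degC set-point change are assumed examples:

    # Annual chiller energy for an assumed 1,000 kW cooling load, using the slide's rules of thumb.
    load_kw = 1000.0
    hours_per_year = 8760

    def annual_chiller_kwh(cop):
        return load_kw / cop * hours_per_year

    air_cooled_kwh = annual_chiller_kwh(2.9)
    water_cooled_kwh = annual_chiller_kwh(7.8)
    print(f"air-cooled: {air_cooled_kwh / 1e6:.1f} GWh/yr, water-cooled: {water_cooled_kwh / 1e6:.1f} GWh/yr")

    # Raising the chilled-water set-point by 5 degC at ~3% savings per degC (item 6):
    savings_kwh = water_cooled_kwh * (1 - 0.97 ** 5)
    print(f"a 5 degC warmer set-point saves roughly {savings_kwh / 1e3:,.0f} MWh/yr")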
  
47
Equipment Environmental Specification
Air inlet to IT equipment is the important specification to meet.
Outlet temperature is not important to IT equipment.
48
Key Nomenclature
Recommended Range (Statement of Reliability): preferred facility operation; most values should be within this range.
Allowable Range (Statement of Functionality): robustness of equipment; no values should be outside this range.
[Diagram: rack intake temperature scale running from MIN ALLOWABLE through MIN RECOMMENDED (Under-Temp), the Recommended Range, and MAX RECOMMENDED (Over-Temp) up to the MAX ALLOWABLE rack intake temperature, with the Allowable Range spanning the full scale.]
  
49
Improve Air Management
•  Typically, more air is circulated than required.
•  Air mixing and short circuiting leads to:
   –  Low supply temperature
   –  Low Delta T
•  Use hot and cold aisles.
•  Improve isolation of hot and cold aisles.
   –  Reduce fan energy (see the sketch below)
   –  Improve air-conditioning efficiency
   –  Increase cooling capacity
Hot aisle/cold aisle configuration decreases mixing of intake & exhaust air, promoting efficiency.
Source: http://www1.eere.energy.gov/femp/pdfs/eedatacenterbestpractices.pdf
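The fan-energy sub-bullet follows from two relations: the airflow needed to carry a heat load scales inversely with the supply-to-return delta-T, and fan power scales roughly with the cube of flow (fan affinity laws). A small sketch; the 100 kW load and the delta-T values are illustrative:

    # How raising the air-side delta-T (by stopping hot/cold mixing) cuts required airflow.
    RHO_AIR, CP_AIR = 1.2, 1006.0    # kg/m^3, J/(kg*K)

    def airflow_m3s(heat_kw, delta_t_k):
        return heat_kw * 1000.0 / (RHO_AIR * CP_AIR * delta_t_k)

    mixed = airflow_m3s(100, 8)        # short-circuited air: small delta-T
    contained = airflow_m3s(100, 16)   # contained aisles: double the delta-T
    print(f"airflow: {mixed:.1f} -> {contained:.1f} m^3/s")
    # Fan affinity laws: power ~ flow^3, so halving the airflow cuts fan power ~8x.
    print(f"fan power ratio ~{(contained / mixed) ** 3:.3f} of the mixed-air case")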
  
50
Isolate Cold and Hot Aisles
With isolation, supply air can be 70-80ºF instead of 45-55ºF, and return air 95-105ºF instead of 60-70ºF.
Source: http://www1.eere.energy.gov/femp/pdfs/eedatacenterbestpractices.pdf
51
Adding Air Curtains for Hot/Cold Isolation
Photo used with permission from the National Snow and Ice Data Center. http://www.nrel.gov/docs/fy12osti/53939.pdf
Courtesy of Henry Coles, Lawrence Berkeley National Laboratory
  
53
Three (3) Cooling Device Categories
[Diagrams: IT equipment racks with cooling water connections, rack and row containment, and server front orientation for each category.]
1 - Rack Cooler
•  APC - water
•  Knürr (CoolTherm) - water
•  Knürr (CoolLoop) - water
•  Rittal - water
2 - Row Cooler
•  APC (2*) - water
•  Liebert - refrigerant
3 - Passive Door Cooler
•  IBM - water
•  Vette/Coolcentric - water
•  Liebert - refrigerant
•  SUN - refrigerant
Courtesy of Henry Coles, Lawrence Berkeley National Laboratory
  
54
"Chill-off 2" Evaluation of Close-coupled Cooling Solutions
Courtesy of Geoffrey Bell and Henry Coles, Lawrence Berkeley National Laboratory
[Chart: measured comparison of the close-coupled cooling solutions; lower bars mean less energy use.]
55
Cooling Takeaways…
•  Use a central plant (e.g. chiller/CRAHs) vs. CRAC units.
•  Use centralized controls on CRAC/CRAH units to prevent simultaneous humidifying and dehumidifying.
•  Move to liquid cooling (room, row, rack, chip).
•  Consider VSDs on fans, pumps, chillers, and towers.
•  Use air- or water-side free cooling.
•  Expand humidity range and improve humidity control (or disconnect).
  

Más contenido relacionado

La actualidad más candente

Clarifying ASHRAE's Recommended Vs. Allowable Temperature Envelopes and How t...
Clarifying ASHRAE's Recommended Vs. Allowable Temperature Envelopes and How t...Clarifying ASHRAE's Recommended Vs. Allowable Temperature Envelopes and How t...
Clarifying ASHRAE's Recommended Vs. Allowable Temperature Envelopes and How t...Upsite Technologies
 
The Science Behind Airflow Management Best Practices
The Science Behind Airflow Management Best PracticesThe Science Behind Airflow Management Best Practices
The Science Behind Airflow Management Best PracticesUpsite Technologies
 
Data Center Cooling Efficiency: Understanding the Science of the 4 Delta T's
Data Center Cooling Efficiency: Understanding the Science of the 4 Delta T'sData Center Cooling Efficiency: Understanding the Science of the 4 Delta T's
Data Center Cooling Efficiency: Understanding the Science of the 4 Delta T'sUpsite Technologies
 
For Most Data Centers, Liquid and Air Cooling Will Not be Mutually Exclusive
For Most Data Centers, Liquid and Air Cooling Will Not be Mutually ExclusiveFor Most Data Centers, Liquid and Air Cooling Will Not be Mutually Exclusive
For Most Data Centers, Liquid and Air Cooling Will Not be Mutually ExclusiveUpsite Technologies
 
Myths of Data Center Containment:Whats's True and What's Not
Myths of Data Center Containment:Whats's True and What's NotMyths of Data Center Containment:Whats's True and What's Not
Myths of Data Center Containment:Whats's True and What's NotUpsite Technologies
 
How IT Decisions Impact Facilities: The Importance of Mutual Understanding
How IT Decisions Impact Facilities: The Importance of Mutual UnderstandingHow IT Decisions Impact Facilities: The Importance of Mutual Understanding
How IT Decisions Impact Facilities: The Importance of Mutual UnderstandingUpsite Technologies
 
Cold Aisle Containment – Maximum kW per Rack
Cold Aisle Containment – Maximum kW per RackCold Aisle Containment – Maximum kW per Rack
Cold Aisle Containment – Maximum kW per RackKnurr USA
 
Green buildings : Challange in Operation and Maintenance
Green buildings : Challange in Operation and MaintenanceGreen buildings : Challange in Operation and Maintenance
Green buildings : Challange in Operation and MaintenanceTejwant Navalkar
 
Sizing of solar cooling systems
Sizing of solar cooling systemsSizing of solar cooling systems
Sizing of solar cooling systemsSolarReference
 
Data center cooling infrastructure slide
Data center cooling infrastructure slideData center cooling infrastructure slide
Data center cooling infrastructure slideLivin Jose
 
Cooling Optimization 101: A Beginner's Guide to Data Center Cooling
Cooling Optimization 101: A Beginner's Guide to Data Center CoolingCooling Optimization 101: A Beginner's Guide to Data Center Cooling
Cooling Optimization 101: A Beginner's Guide to Data Center CoolingUpsite Technologies
 

La actualidad más candente (11)

Clarifying ASHRAE's Recommended Vs. Allowable Temperature Envelopes and How t...
Clarifying ASHRAE's Recommended Vs. Allowable Temperature Envelopes and How t...Clarifying ASHRAE's Recommended Vs. Allowable Temperature Envelopes and How t...
Clarifying ASHRAE's Recommended Vs. Allowable Temperature Envelopes and How t...
 
The Science Behind Airflow Management Best Practices
The Science Behind Airflow Management Best PracticesThe Science Behind Airflow Management Best Practices
The Science Behind Airflow Management Best Practices
 
Data Center Cooling Efficiency: Understanding the Science of the 4 Delta T's
Data Center Cooling Efficiency: Understanding the Science of the 4 Delta T'sData Center Cooling Efficiency: Understanding the Science of the 4 Delta T's
Data Center Cooling Efficiency: Understanding the Science of the 4 Delta T's
 
For Most Data Centers, Liquid and Air Cooling Will Not be Mutually Exclusive
For Most Data Centers, Liquid and Air Cooling Will Not be Mutually ExclusiveFor Most Data Centers, Liquid and Air Cooling Will Not be Mutually Exclusive
For Most Data Centers, Liquid and Air Cooling Will Not be Mutually Exclusive
 
Myths of Data Center Containment:Whats's True and What's Not
Myths of Data Center Containment:Whats's True and What's NotMyths of Data Center Containment:Whats's True and What's Not
Myths of Data Center Containment:Whats's True and What's Not
 
How IT Decisions Impact Facilities: The Importance of Mutual Understanding
How IT Decisions Impact Facilities: The Importance of Mutual UnderstandingHow IT Decisions Impact Facilities: The Importance of Mutual Understanding
How IT Decisions Impact Facilities: The Importance of Mutual Understanding
 
Cold Aisle Containment – Maximum kW per Rack
Cold Aisle Containment – Maximum kW per RackCold Aisle Containment – Maximum kW per Rack
Cold Aisle Containment – Maximum kW per Rack
 
Green buildings : Challange in Operation and Maintenance
Green buildings : Challange in Operation and MaintenanceGreen buildings : Challange in Operation and Maintenance
Green buildings : Challange in Operation and Maintenance
 
Sizing of solar cooling systems
Sizing of solar cooling systemsSizing of solar cooling systems
Sizing of solar cooling systems
 
Data center cooling infrastructure slide
Data center cooling infrastructure slideData center cooling infrastructure slide
Data center cooling infrastructure slide
 
Cooling Optimization 101: A Beginner's Guide to Data Center Cooling
Cooling Optimization 101: A Beginner's Guide to Data Center CoolingCooling Optimization 101: A Beginner's Guide to Data Center Cooling
Cooling Optimization 101: A Beginner's Guide to Data Center Cooling
 

Similar a NREL Data Center Design Saves 95% on Energy Costs

Bits, Bytes and BTUs: Warm Water Liquid Cooling at NREL
Bits, Bytes and BTUs: Warm Water Liquid Cooling at NRELBits, Bytes and BTUs: Warm Water Liquid Cooling at NREL
Bits, Bytes and BTUs: Warm Water Liquid Cooling at NRELinside-BigData.com
 
Aurora hpc energy efficiency
Aurora hpc energy efficiencyAurora hpc energy efficiency
Aurora hpc energy efficiencyEurotech Aurora
 
The Economics of Green HPC
The Economics of Green HPCThe Economics of Green HPC
The Economics of Green HPCIntel IT Center
 
EnergyEffic.pdf
EnergyEffic.pdfEnergyEffic.pdf
EnergyEffic.pdfsash236
 
Total Liquid Cooling
Total Liquid CoolingTotal Liquid Cooling
Total Liquid CoolingIceotopePR
 
LiquidCool Solutions - NREL test results!
LiquidCool Solutions - NREL test results! LiquidCool Solutions - NREL test results!
LiquidCool Solutions - NREL test results! Daren Klum
 
Slides: The Top 3 North America Data Center Trends for Cooling
Slides: The Top 3 North America Data Center Trends for CoolingSlides: The Top 3 North America Data Center Trends for Cooling
Slides: The Top 3 North America Data Center Trends for CoolingGraybar
 
NREL’s Research Support Facility: Making Plug Loads Count
NREL’s Research Support Facility:  Making Plug Loads CountNREL’s Research Support Facility:  Making Plug Loads Count
NREL’s Research Support Facility: Making Plug Loads CountShanti Pless
 
Benchmarking/Performance Measuring
Benchmarking/Performance MeasuringBenchmarking/Performance Measuring
Benchmarking/Performance MeasuringASHRAE Region VI
 
Data Center PUE Reconsidered
Data Center PUE Reconsidered Data Center PUE Reconsidered
Data Center PUE Reconsidered Raritan
 
Data center-server-cooling-power-management-paper
Data center-server-cooling-power-management-paperData center-server-cooling-power-management-paper
Data center-server-cooling-power-management-paperDileep Bhandarkar
 
Utilizing Analytics to Drive Change in Buildings - Apem Sept 18 2015
Utilizing Analytics to Drive Change in Buildings - Apem Sept 18 2015Utilizing Analytics to Drive Change in Buildings - Apem Sept 18 2015
Utilizing Analytics to Drive Change in Buildings - Apem Sept 18 2015buildpulse
 
CPD Presentation Evaporative cooling in data centres
CPD Presentation   Evaporative cooling in data centresCPD Presentation   Evaporative cooling in data centres
CPD Presentation Evaporative cooling in data centresColt UK
 
Plug Load Efficiency for Zero Energy Buildings Webinar 1 29 2013
Plug Load Efficiency for Zero Energy Buildings Webinar 1 29 2013Plug Load Efficiency for Zero Energy Buildings Webinar 1 29 2013
Plug Load Efficiency for Zero Energy Buildings Webinar 1 29 2013Shanti Pless
 

Similar a NREL Data Center Design Saves 95% on Energy Costs (20)

Bits, Bytes and BTUs: Warm Water Liquid Cooling at NREL
Bits, Bytes and BTUs: Warm Water Liquid Cooling at NRELBits, Bytes and BTUs: Warm Water Liquid Cooling at NREL
Bits, Bytes and BTUs: Warm Water Liquid Cooling at NREL
 
Aurora hpc energy efficiency
Aurora hpc energy efficiencyAurora hpc energy efficiency
Aurora hpc energy efficiency
 
The Economics of Green HPC
The Economics of Green HPCThe Economics of Green HPC
The Economics of Green HPC
 
EnergyEffic.pdf
EnergyEffic.pdfEnergyEffic.pdf
EnergyEffic.pdf
 
Total Liquid Cooling
Total Liquid CoolingTotal Liquid Cooling
Total Liquid Cooling
 
Steve hammond nrel
Steve hammond nrelSteve hammond nrel
Steve hammond nrel
 
LiquidCool Solutions - NREL test results!
LiquidCool Solutions - NREL test results! LiquidCool Solutions - NREL test results!
LiquidCool Solutions - NREL test results!
 
Slides: The Top 3 North America Data Center Trends for Cooling
Slides: The Top 3 North America Data Center Trends for CoolingSlides: The Top 3 North America Data Center Trends for Cooling
Slides: The Top 3 North America Data Center Trends for Cooling
 
Showcase ppt ver 8
Showcase ppt ver 8Showcase ppt ver 8
Showcase ppt ver 8
 
Showcase ppt ver 8
Showcase ppt ver 8Showcase ppt ver 8
Showcase ppt ver 8
 
NREL’s Research Support Facility: Making Plug Loads Count
NREL’s Research Support Facility:  Making Plug Loads CountNREL’s Research Support Facility:  Making Plug Loads Count
NREL’s Research Support Facility: Making Plug Loads Count
 
Ashrae thermal guidelines svlg 2015 (1)
Ashrae thermal guidelines  svlg 2015 (1)Ashrae thermal guidelines  svlg 2015 (1)
Ashrae thermal guidelines svlg 2015 (1)
 
Benchmarking/Performance Measuring
Benchmarking/Performance MeasuringBenchmarking/Performance Measuring
Benchmarking/Performance Measuring
 
Green computing
Green computingGreen computing
Green computing
 
Data Center PUE Reconsidered
Data Center PUE Reconsidered Data Center PUE Reconsidered
Data Center PUE Reconsidered
 
Data center-server-cooling-power-management-paper
Data center-server-cooling-power-management-paperData center-server-cooling-power-management-paper
Data center-server-cooling-power-management-paper
 
Utilizing Analytics to Drive Change in Buildings - Apem Sept 18 2015
Utilizing Analytics to Drive Change in Buildings - Apem Sept 18 2015Utilizing Analytics to Drive Change in Buildings - Apem Sept 18 2015
Utilizing Analytics to Drive Change in Buildings - Apem Sept 18 2015
 
CPD Presentation Evaporative cooling in data centres
CPD Presentation   Evaporative cooling in data centresCPD Presentation   Evaporative cooling in data centres
CPD Presentation Evaporative cooling in data centres
 
PUE Reconsidered
PUE ReconsideredPUE Reconsidered
PUE Reconsidered
 
Plug Load Efficiency for Zero Energy Buildings Webinar 1 29 2013
Plug Load Efficiency for Zero Energy Buildings Webinar 1 29 2013Plug Load Efficiency for Zero Energy Buildings Webinar 1 29 2013
Plug Load Efficiency for Zero Energy Buildings Webinar 1 29 2013
 

Más de JISC's Green ICT Programme

What does central IT really cost? - An attempt to find out
What does central IT really cost? - An attempt to find outWhat does central IT really cost? - An attempt to find out
What does central IT really cost? - An attempt to find outJISC's Green ICT Programme
 
Data Centre Compute and Overhead Costs - Delivering End-to-end KPIs
Data Centre Compute and Overhead Costs - Delivering End-to-end KPIsData Centre Compute and Overhead Costs - Delivering End-to-end KPIs
Data Centre Compute and Overhead Costs - Delivering End-to-end KPIsJISC's Green ICT Programme
 
Better Decision Making by Understanding IT Spend
Better Decision Making by Understanding IT SpendBetter Decision Making by Understanding IT Spend
Better Decision Making by Understanding IT SpendJISC's Green ICT Programme
 
Understanding IT Costs - Part of Jisc's Costing IT Services Project
Understanding IT Costs - Part of Jisc's Costing IT Services ProjectUnderstanding IT Costs - Part of Jisc's Costing IT Services Project
Understanding IT Costs - Part of Jisc's Costing IT Services ProjectJISC's Green ICT Programme
 
CARBS Project Presentation - Jisc Cost of IT Services 10-02-14
CARBS Project Presentation - Jisc Cost of IT Services 10-02-14CARBS Project Presentation - Jisc Cost of IT Services 10-02-14
CARBS Project Presentation - Jisc Cost of IT Services 10-02-14JISC's Green ICT Programme
 
Circling the Square! - Understanding and Minimising the Costs of Providing Di...
Circling the Square! - Understanding and Minimising the Costs of Providing Di...Circling the Square! - Understanding and Minimising the Costs of Providing Di...
Circling the Square! - Understanding and Minimising the Costs of Providing Di...JISC's Green ICT Programme
 
Migrating from a physical to a hosted Data Centre - Experiences of a small Un...
Migrating from a physical to a hosted Data Centre - Experiences of a small Un...Migrating from a physical to a hosted Data Centre - Experiences of a small Un...
Migrating from a physical to a hosted Data Centre - Experiences of a small Un...JISC's Green ICT Programme
 
The “Financial X-ray” of IT Services costs -
The “Financial X-ray” of IT Services costs - The “Financial X-ray” of IT Services costs -
The “Financial X-ray” of IT Services costs - JISC's Green ICT Programme
 
Energy Efficient Server Rooms at the University of Cambridge
Energy Efficient Server Rooms at the University of CambridgeEnergy Efficient Server Rooms at the University of Cambridge
Energy Efficient Server Rooms at the University of CambridgeJISC's Green ICT Programme
 
What does central IT really cost? An attempt to find out! - Heidi Fraser-Krau...
What does central IT really cost? An attempt to find out! - Heidi Fraser-Krau...What does central IT really cost? An attempt to find out! - Heidi Fraser-Krau...
What does central IT really cost? An attempt to find out! - Heidi Fraser-Krau...JISC's Green ICT Programme
 
Understanding Data Centre Costs: Lessons from e-InfraNet and JISC activities
Understanding Data Centre Costs: Lessons from e-InfraNet and JISC activitiesUnderstanding Data Centre Costs: Lessons from e-InfraNet and JISC activities
Understanding Data Centre Costs: Lessons from e-InfraNet and JISC activitiesJISC's Green ICT Programme
 
Ppt 1 le leeds - welcome alan real ( university of leeds )
Ppt 1 le   leeds - welcome alan real ( university of leeds )Ppt 1 le   leeds - welcome alan real ( university of leeds )
Ppt 1 le leeds - welcome alan real ( university of leeds )JISC's Green ICT Programme
 
Ppt6 london - james harbridge ( intellect uk ) crc policy update
Ppt6   london - james harbridge ( intellect uk ) crc policy updatePpt6   london - james harbridge ( intellect uk ) crc policy update
Ppt6 london - james harbridge ( intellect uk ) crc policy updateJISC's Green ICT Programme
 
Ppt5 exp lonodn - kevin cope & alex yakimov ( imperial college ) data cent...
Ppt5   exp lonodn - kevin cope & alex yakimov ( imperial college )  data cent...Ppt5   exp lonodn - kevin cope & alex yakimov ( imperial college )  data cent...
Ppt5 exp lonodn - kevin cope & alex yakimov ( imperial college ) data cent...JISC's Green ICT Programme
 
Ppt4 london - michael rudgyard ( concurrent thinking ) driving efficiencie...
Ppt4   london -  michael rudgyard ( concurrent thinking ) driving efficiencie...Ppt4   london -  michael rudgyard ( concurrent thinking ) driving efficiencie...
Ppt4 london - michael rudgyard ( concurrent thinking ) driving efficiencie...JISC's Green ICT Programme
 
Ppt4 exp leeds - alan real and jon summers ( university of leeds ) experien...
Ppt4   exp leeds - alan real and jon summers ( university of leeds ) experien...Ppt4   exp leeds - alan real and jon summers ( university of leeds ) experien...
Ppt4 exp leeds - alan real and jon summers ( university of leeds ) experien...JISC's Green ICT Programme
 
Ppt4 exp birmingham - steve bowes phipps ( university of hertfordshire ) - ...
Ppt4   exp birmingham - steve bowes phipps ( university of hertfordshire ) - ...Ppt4   exp birmingham - steve bowes phipps ( university of hertfordshire ) - ...
Ppt4 exp birmingham - steve bowes phipps ( university of hertfordshire ) - ...JISC's Green ICT Programme
 
Ppt3 london - sophia ( operation intelligence ) what is the eu code of conduct
Ppt3   london - sophia ( operation intelligence ) what is the eu code of conductPpt3   london - sophia ( operation intelligence ) what is the eu code of conduct
Ppt3 london - sophia ( operation intelligence ) what is the eu code of conductJISC's Green ICT Programme
 
Ppt2 london - mike walker ( defra ) background to the eu code of conduct j...
Ppt2   london - mike walker ( defra )  background to the eu code of conduct j...Ppt2   london - mike walker ( defra )  background to the eu code of conduct j...
Ppt2 london - mike walker ( defra ) background to the eu code of conduct j...JISC's Green ICT Programme
 
Ppt1 london -simon allen ( concurrent thinking ) welcome
Ppt1   london -simon allen ( concurrent thinking ) welcomePpt1   london -simon allen ( concurrent thinking ) welcome
Ppt1 london -simon allen ( concurrent thinking ) welcomeJISC's Green ICT Programme
 

Más de JISC's Green ICT Programme (20)

What does central IT really cost? - An attempt to find out
What does central IT really cost? - An attempt to find outWhat does central IT really cost? - An attempt to find out
What does central IT really cost? - An attempt to find out
 
Data Centre Compute and Overhead Costs - Delivering End-to-end KPIs
Data Centre Compute and Overhead Costs - Delivering End-to-end KPIsData Centre Compute and Overhead Costs - Delivering End-to-end KPIs
Data Centre Compute and Overhead Costs - Delivering End-to-end KPIs
 
Better Decision Making by Understanding IT Spend
Better Decision Making by Understanding IT SpendBetter Decision Making by Understanding IT Spend
Better Decision Making by Understanding IT Spend
 
Understanding IT Costs - Part of Jisc's Costing IT Services Project
Understanding IT Costs - Part of Jisc's Costing IT Services ProjectUnderstanding IT Costs - Part of Jisc's Costing IT Services Project
Understanding IT Costs - Part of Jisc's Costing IT Services Project
 
CARBS Project Presentation - Jisc Cost of IT Services 10-02-14
CARBS Project Presentation - Jisc Cost of IT Services 10-02-14CARBS Project Presentation - Jisc Cost of IT Services 10-02-14
CARBS Project Presentation - Jisc Cost of IT Services 10-02-14
 
Circling the Square! - Understanding and Minimising the Costs of Providing Di...
Circling the Square! - Understanding and Minimising the Costs of Providing Di...Circling the Square! - Understanding and Minimising the Costs of Providing Di...
Circling the Square! - Understanding and Minimising the Costs of Providing Di...
 
Migrating from a physical to a hosted Data Centre - Experiences of a small Un...
Migrating from a physical to a hosted Data Centre - Experiences of a small Un...Migrating from a physical to a hosted Data Centre - Experiences of a small Un...
Migrating from a physical to a hosted Data Centre - Experiences of a small Un...
 
The “Financial X-ray” of IT Services costs -
The “Financial X-ray” of IT Services costs - The “Financial X-ray” of IT Services costs -
The “Financial X-ray” of IT Services costs -
 
Energy Efficient Server Rooms at the University of Cambridge
Energy Efficient Server Rooms at the University of CambridgeEnergy Efficient Server Rooms at the University of Cambridge
Energy Efficient Server Rooms at the University of Cambridge
 
What does central IT really cost? An attempt to find out! - Heidi Fraser-Krau...
What does central IT really cost? An attempt to find out! - Heidi Fraser-Krau...What does central IT really cost? An attempt to find out! - Heidi Fraser-Krau...
What does central IT really cost? An attempt to find out! - Heidi Fraser-Krau...
 
Understanding Data Centre Costs: Lessons from e-InfraNet and JISC activities
Understanding Data Centre Costs: Lessons from e-InfraNet and JISC activitiesUnderstanding Data Centre Costs: Lessons from e-InfraNet and JISC activities
Understanding Data Centre Costs: Lessons from e-InfraNet and JISC activities
 
Ppt 1 le leeds - welcome alan real ( university of leeds )
Ppt 1 le   leeds - welcome alan real ( university of leeds )Ppt 1 le   leeds - welcome alan real ( university of leeds )
Ppt 1 le leeds - welcome alan real ( university of leeds )
 
Ppt6 london - james harbridge ( intellect uk ) crc policy update
Ppt6   london - james harbridge ( intellect uk ) crc policy updatePpt6   london - james harbridge ( intellect uk ) crc policy update
Ppt6 london - james harbridge ( intellect uk ) crc policy update
 
Ppt5 exp lonodn - kevin cope & alex yakimov ( imperial college ) data cent...
Ppt5   exp lonodn - kevin cope & alex yakimov ( imperial college )  data cent...Ppt5   exp lonodn - kevin cope & alex yakimov ( imperial college )  data cent...
Ppt5 exp lonodn - kevin cope & alex yakimov ( imperial college ) data cent...
 
Ppt4 london - michael rudgyard ( concurrent thinking ) driving efficiencie...
Ppt4   london -  michael rudgyard ( concurrent thinking ) driving efficiencie...Ppt4   london -  michael rudgyard ( concurrent thinking ) driving efficiencie...
Ppt4 london - michael rudgyard ( concurrent thinking ) driving efficiencie...
 
Ppt4 exp leeds - alan real and jon summers ( university of leeds ) experien...
Ppt4   exp leeds - alan real and jon summers ( university of leeds ) experien...Ppt4   exp leeds - alan real and jon summers ( university of leeds ) experien...
Ppt4 exp leeds - alan real and jon summers ( university of leeds ) experien...
 
Ppt4 exp birmingham - steve bowes phipps ( university of hertfordshire ) - ...
Ppt4   exp birmingham - steve bowes phipps ( university of hertfordshire ) - ...Ppt4   exp birmingham - steve bowes phipps ( university of hertfordshire ) - ...
Ppt4 exp birmingham - steve bowes phipps ( university of hertfordshire ) - ...
 
Ppt3 london - sophia ( operation intelligence ) what is the eu code of conduct
Ppt3   london - sophia ( operation intelligence ) what is the eu code of conductPpt3   london - sophia ( operation intelligence ) what is the eu code of conduct
Ppt3 london - sophia ( operation intelligence ) what is the eu code of conduct
 
Ppt2 london - mike walker ( defra ) background to the eu code of conduct j...
Ppt2   london - mike walker ( defra )  background to the eu code of conduct j...Ppt2   london - mike walker ( defra )  background to the eu code of conduct j...
Ppt2 london - mike walker ( defra ) background to the eu code of conduct j...
 
Ppt1 london -simon allen ( concurrent thinking ) welcome
Ppt1   london -simon allen ( concurrent thinking ) welcomePpt1   london -simon allen ( concurrent thinking ) welcome
Ppt1 london -simon allen ( concurrent thinking ) welcome
 

Último

Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Enterprise Knowledge
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?Antenna Manufacturer Coco
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxMalak Abu Hammad
 
Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsJoaquim Jorge
 
Scaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationScaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationRadu Cotescu
 
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...apidays
 
CNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of ServiceCNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of Servicegiselly40
 
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Igalia
 
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationFrom Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationSafe Software
 
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEarley Information Science
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking MenDelhi Call girls
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking MenDelhi Call girls
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Miguel Araújo
 
Factors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxFactors to Consider When Choosing Accounts Payable Services Providers.pptx
Factors to Consider When Choosing Accounts Payable Services Providers.pptxKatpro Technologies
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUK Journal
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptxHampshireHUG
 
Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)Powerful Google developer tools for immediate impact! (2023-24 C)
Powerful Google developer tools for immediate impact! (2023-24 C)wesley chun
 
Slack Application Development 101 Slides
Slack Application Development 101 SlidesSlack Application Development 101 Slides
Slack Application Development 101 Slidespraypatel2
 
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...Neo4j
 
Automating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps ScriptAutomating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps Scriptwesley chun
 

Último (20)

Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?
 

NREL Data Center Design Saves 95% on Energy Costs

  • 1. NREL is a national laboratory of the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, operated by the Alliance for Sustainable Energy, LLC. US Trends in Data Centre Design with NREL Examples of Large Energy Savings. Understanding and Minimising the Costs of Data Centre Based IT Services Conference, University of Liverpool. Otto Van Geet, PE. June 17, 2013.
  • 3. BPG Table of Contents: Summary • Background • Information Technology Systems • Environmental Conditions • Air Management • Cooling Systems • Electrical Systems • Other Opportunities for Energy Efficient Design • Data Center Metrics & Benchmarking.
  • 4. Safe Temperature Limits: CPUs ~65°C (149°F), GPUs ~75°C (167°F), memory ~85°C (185°F). CPU, GPU, and memory represent ~75–90% of the heat load.
  • 5. Environmental Conditions: data center equipment's environmental conditions should fall within the ranges established by ASHRAE as published in the Thermal Guidelines book. Environmental specifications at the equipment intake (ASHRAE 2008, 2011): Temperature – recommended 18°–27°C, allowable 15°–32°C (A1) to 5°–45°C (A4). Humidity – recommended 5.5°C dew point to 60% RH and 15°C dew point, allowable 20%–80% RH.
  • 6. 2011 ASHRAE Allowable Ranges (dry-bulb temperature).
  • 7. Psychrometric Bin Analysis. [Chart: Boulder, Colorado TMY3 weather data plotted as humidity ratio (lb water/lb dry air) versus dry-bulb temperature (°F), with the ASHRAE Class 1 recommended and allowable ranges overlaid.] Design conditions (0.4%): 91.2°F dry bulb, 60.6°F wet bulb.
  • 8. Estimated Savings. Baseline system: DX cooling with no economizer; load: 1 ton of cooling, constant year-round; efficiency (COP): 3; total energy: 10,270 kWh/yr. Results (hours / energy in kWh), recommended range then allowable range:
    Zone 1, DX cooling only: 25 h / 8 kWh; 2 h / 1 kWh
    Zone 2, multistage indirect evap. + DX (H80): 26 h / 16 kWh; 4 h / 3 kWh
    Zone 3, multistage indirect evap. only: 3 h / 1 kWh; 0 h / 0 kWh
    Zone 4, evap. cooler only: 867 h / 97 kWh; 510 h / 57 kWh
    Zone 5, evap. cooler + outside air: 6,055 h / 417 kWh; 1,656 h / 99 kWh
    Zone 6, outside air only: 994 h / 0 kWh; 4,079 h / 0 kWh
    Zone 7, 100% outside air: 790 h / 0 kWh; 2,509 h / 0 kWh
    Total: 8,760 h / 538 kWh (≈95% savings); 8,760 h / 160 kWh (≈98% savings)
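The 95%/98% figures follow directly from the baseline. A minimal sketch of that arithmetic, assuming the standard conversion of 1 ton of cooling ≈ 3.517 kW of heat removal (the conversion factor is not stated on the slide):

```python
# Baseline arithmetic behind the "Estimated Savings" slide.
TON_KW = 3.517          # assumed: kW of heat removal per ton of cooling
COP = 3.0               # baseline DX efficiency from the slide
HOURS_PER_YEAR = 8760

baseline_kwh = TON_KW / COP * HOURS_PER_YEAR
print(f"Baseline DX energy: {baseline_kwh:,.0f} kWh/yr")   # ~10,270 kWh/yr

# Percent savings for the slide's economizer/evaporative totals
for label, kwh in [("Recommended range", 538), ("Allowable range", 160)]:
    savings = 1 - kwh / baseline_kwh
    print(f"{label}: {kwh} kWh/yr -> {savings:.0%} savings")  # ~95% and ~98%
```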
  • 9. Data Center Efficiency Metric. Power Usage Effectiveness (PUE) is an industry-standard data center efficiency metric: the ratio of power used or lost by data center facility infrastructure (pumps, lights, fans, conversions, UPS, ...) to power used by compute. It is not perfect, and some folks play games with it. A 2011 survey estimates the industry average is 1.8; in a typical data center, half of the power goes to things other than compute capability. PUE = ("IT power" + "Facility power") / "IT power".
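A minimal sketch of that ratio in code; the loads below are illustrative only, chosen to reproduce the 1.8 industry average and the 1.06 ESIF figure quoted later in the deck:

```python
def pue(it_power_kw: float, facility_power_kw: float) -> float:
    """Power Usage Effectiveness = (IT power + facility power) / IT power."""
    return (it_power_kw + facility_power_kw) / it_power_kw

# Illustrative loads only (not measured data):
print(pue(1000.0, 800.0))   # 1.8  -> roughly the 2011 industry average
print(pue(1000.0, 60.0))    # 1.06 -> the value NREL reports for the ESIF data center
```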
  • 10. PUE – Simple and Effective.
  • 11. Data Center PUE. [Chart: data center PUE plotted together with outdoor temperature (°F); PUE axis roughly 0.75–1.45.]
  • 12. “I am re-using waste heat from my data center on another part of my site and my PUE is 0.8!” ASHRAE and friends (DOE, EPA, TGG, 7x24, etc.) do not allow reused energy in PUE, and PUE is always >1.0. Another metric has been developed by The Green Grid and partners: ERE – Energy Reuse Effectiveness. http://www.thegreengrid.org/en/Global/Content/white-papers/ERE
  • 13. ERE – Adds Energy Reuse. [Diagram: energy flows from the utility through cooling, UPS, and PDU to IT, with rejected energy and reused energy paths labeled (a)–(g).]
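The Green Grid white paper linked above defines ERE as the total energy brought into the data center, minus whatever is reused elsewhere, divided by IT energy. A small sketch, with values chosen only to illustrate how a PUE-1.18 facility can reach the ERE of ~0.9 reported for the RSF data center two slides below:

```python
def ere(it_kwh: float, facility_kwh: float, reused_kwh: float) -> float:
    """Energy Reuse Effectiveness = (IT + facility - reused) / IT.
    Unlike PUE, ERE can drop below 1.0 once waste heat is exported."""
    return (it_kwh + facility_kwh - reused_kwh) / it_kwh

it = 1000.0        # illustrative annual IT energy, kWh
facility = 180.0   # overhead consistent with PUE = 1.18
print(ere(it, facility, reused_kwh=0.0))     # 1.18 -> no reuse, ERE equals PUE
print(ere(it, facility, reused_kwh=280.0))   # 0.90 -> enough reuse to match the reported ERE
```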
  • 14. DOE/NREL Research Support Facility. More than 1,300 people in DOE office space on NREL's campus; 33,445 m². Design/build process with required energy goals: 50% energy savings from code; LEED Platinum. Replicable process, technologies, and cost. Site, source, carbon, and cost ZEB:B – includes plug loads and the data center. Firm fixed price – US $22.8/m² construction cost (not including $2.5/m² for PV from PPA/ARRA). Opened June 10, 2010 (first phase). Credit: Haselden Construction.
  • 15. RSF Datacenter. Fully containing hot aisle: custom aisle floor and door seals; ensure equipment is designed for cold-aisle containment and installed to pull cold air, not hot air. 1.18 annual PUE; ERE = 0.9. Control hot aisle based on a return temperature of ~90°F. Waste heat used to heat the building. Outside air and evaporative cooling; low-fan-energy design; 176 m². Credit: Marjorie Schott/NREL.
  • 16. [Image-only slide.]
  • 17. Data center load GROWTH (40+ kW in 2 years) since there is NO recharge!
  • 18. Move to Liquid Cooling. Server fans are inefficient and noisy; liquid doors are an improvement, but we can do better. Power densities are rising, making component-level liquid cooling solutions more appropriate. Liquid benefits: thermal stability and reduced component failures; better waste-heat re-use options; warm-water cooling reduces/eliminates condensation; cooling can be provided with higher-temperature coolant. Eliminate expensive and inefficient chillers. Save wasted fan energy and use it for computing. Unlock your cores and overclock to increase throughput!
  • 19. Liquid Cooling – Overview. Water and other liquids (dielectrics, glycols, and refrigerants) may be used for heat removal. Liquids typically use LESS transport energy (14.36 air-to-water horsepower ratio for the example below). Liquid-to-liquid heat exchangers have closer approach temperatures than liquid-to-air (coils), yielding increased outside-air hours.
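A rough sketch of why the transport-energy gap is so large; this is not the slide's 14.36 horsepower example, just the heat-capacity arithmetic behind it, with an assumed 10 kW load and 12 K coolant temperature rise:

```python
# Compare the flow needed to carry the same heat at the same temperature rise.
Q_KW = 10.0          # assumed heat to remove
DELTA_T = 12.0       # assumed coolant temperature rise, K

# Approximate fluid properties near room temperature
fluids = {
    "air":   {"cp_kj_per_kg_k": 1.006, "density_kg_per_m3": 1.2},
    "water": {"cp_kj_per_kg_k": 4.186, "density_kg_per_m3": 998.0},
}

for name, p in fluids.items():
    mass_flow = Q_KW / (p["cp_kj_per_kg_k"] * DELTA_T)   # kg/s
    vol_flow = mass_flow / p["density_kg_per_m3"]        # m^3/s
    print(f"{name}: {mass_flow:.3f} kg/s, {vol_flow*1000:.2f} L/s")
# Water needs roughly 1/3000th the volumetric flow of air for the same duty,
# which is why pump energy "trumps" fan energy.
```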
  • 20. 2011 ASHRAE Liquid Cooling Guidelines. NREL ESIF HPC (HP hardware) uses 24°C supply, 40°C return – W4/W5.
  • 21. NREL HPC Data Center – showcase facility, High Performance Computing. 10 MW, 929 m². Leverage the favorable climate; use direct water-to-rack cooling; DC manager responsible for ALL DC cost including energy! Waste heat captured and used to heat labs and offices. World's most energy-efficient data center, PUE 1.06! Lower CapEx and OpEx. Leveraged expertise in energy-efficient buildings to focus on a showcase data center – a chips-to-bricks approach. Operational 1-2013, petascale+ HPC capability in 8-2013. 20-year planning horizon: 5 to 6 HPC generations.
  • 22. Critical Data Center Specs. Warm-water cooling, 24°C: water is a much better working fluid than air – pumps trump fans; utilize high-quality waste heat, 40°C or warmer; +90% of IT heat load to liquid. High-power distribution: 480 VAC, eliminate conversions. Think outside the box: don't be satisfied with an energy-efficient data center nestled on a campus surrounded by inefficient laboratory and office buildings – innovate, integrate, optimize. Dashboards report instantaneous, seasonal, and cumulative PUE values.
  • 23. NREL ESIF Data Center Cross Section. The data center equivalent of the “visible man”: reveal not just boxes with blinky lights, but the inner workings of the building as well; tour views into the pump room and mechanical spaces; color-coded pipes, LCD monitors.
  • 24. Data Center: 2.5 MW day-one capacity (utility $500K/yr/MW); 10 MW ultimate capacity; petaflop; no vapor compression for cooling.
  • 25. Summer Cooling Mode. PUE – typical data center = 1.5–2.0; NREL ESIF = 1.04 (*30% more energy efficient than your typical “green” data center). [Diagram: data center summer cooling mode.]
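To put that PUE gap in dollar terms, a minimal sketch using the $500K per MW-year utility figure from the previous slide and an assumed 2.5 MW of IT load (the day-one capacity; the slides do not state the IT load used for this comparison):

```python
# Rough facility-overhead cost at different PUE values.
UTILITY_COST_PER_MW_YR = 500_000   # USD, from the previous slide
IT_LOAD_MW = 2.5                   # assumed IT load for illustration only

for pue in (2.0, 1.5, 1.04):
    overhead_mw = IT_LOAD_MW * (pue - 1.0)
    cost = overhead_mw * UTILITY_COST_PER_MW_YR
    print(f"PUE {pue}: {overhead_mw:.2f} MW of overhead ~ ${cost:,.0f}/yr")
# Dropping from PUE 1.5 to 1.04 on 2.5 MW of IT load saves ~$575K/yr
# in facility overhead alone.
```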
  • 26. Winter Cooling Mode. ERE – Energy Reuse Effectiveness: how efficiently are we using the waste heat to heat the rest of the building? NREL ESIF = 0.7 (we use 30% of the waste heat; more with future campus loops). [Diagram: waste heat routed to the high bay, office, and conference heating loops, plus future campus heating loops.]
  • 27. Data Center – Cooling Strategy. [Diagram: 75°F supply air, 95°F return air.] Water-to-rack cooling for high-performance computers handles 90% of the total load; air cooling for legacy equipment handles 10% of the total load.
  • 28. PUE 1.0X – Focus on the “1”. Facility PUE; IT power consumption; energy re-use. We all know how to do this! True efficiency requires 3-D optimization.
  • 29. PUE 1.0X – Focus on the “1”. Facility PUE; IT power consumption; energy re-use. We all know how to do this! Increased work per watt; reduce or eliminate fans; component-level heat exchange; the newest processors are more efficient. True efficiency requires 3-D optimization.
  • 30. PUE 1.0X – Focus on the “1”. Facility PUE; IT power consumption; energy re-use. We all know how to do this! Increased work per watt; reduce or eliminate fans; component-level heat exchange; the newest processors are more efficient. Direct liquid cooling, higher return water temperatures; a holistic view of data center planning. True efficiency requires 3-D optimization.
  • 31. What's Next? ✓ Energy-efficient supporting infrastructure. ✓ Pumps, large pipes, high-voltage (380 to 480) electrical to the rack. ✓ Efficient HPC for the planned workload. ✓ Capture and re-use waste heat. Can we manage and “optimize” workflows, with a varied job mix, within a given energy “budget”? Can we do this as part of a larger “ecosystem”? Steve Hammond
  • 32. Other Factors. DemandSMART: comprehensive demand response. Balancing supply and demand on the electricity grid is difficult and expensive; end users that provide a balancing resource are compensated for the service. [Chart: annual electricity demand as a percent of available capacity across winter, spring, summer, and fall.] 4 MW solar; use waste heat; better rates, shed load; the data center as part of the campus energy system.
  • 33. Parting Thoughts. Energy-efficient data centers – been there, done that: we know how, let's just apply best practices; don't fear H2O – liquid cooling will be increasingly prevalent. Metrics will lead us into sustainability: if you don't measure/monitor it, you can't manage it; as PUE has done, ERE, Carbon Usage Effectiveness (CUE), etc. will help drive sustainability. Energy-efficient and sustainable computing – it's all about the “1”: 1.0 or 0.06? Where do we focus? Compute and energy reuse. Holistic approaches to energy management: lots of open research questions; projects may get an energy allocation rather than a node-hour allocation.
  • 34. QUESTIONS? Otto Van Geet, 303.384.7369, Otto.VanGeet@nrel.gov. NREL RSF: 50% of code energy use; net-zero annual energy; $22.8/m² construction cost.
  • 35. Water Considerations. “We shouldn't use evaporative cooling, water is scarce.” Thermoelectric power generation (coal, oil, natural gas, and nuclear) consumes about 1.1 gallons per kWh, on average; this amounts to about 9.6 M gallons per MW-year. We estimate about 2.5 M gallons of water consumed per MW-year for on-site evaporative cooling towers at NREL. If chillers need 0.2 MW per MW of HPC power, then chillers have an impact of 2.375 M gallons per year per MW. Actuals will depend on your site, but evaporative cooling doesn't necessarily result in a net increase in water use. Low energy use = lower water use, and energy reuse uses NO water! (NREL PIX 00181)
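A quick check of the off-site water figure, assuming continuous operation for a full year (8,760 hours):

```python
# Arithmetic behind the slide's off-site water estimate.
GAL_PER_KWH = 1.1               # thermoelectric generation water use, gal/kWh
KWH_PER_MW_YEAR = 1000 * 8760   # one MW running continuously for a year

print(f"{GAL_PER_KWH * KWH_PER_MW_YEAR / 1e6:.1f} M gallons per MW-year")  # ~9.6
# Compare with the ~2.5 M gallons/MW-year the slide estimates for NREL's
# on-site evaporative cooling towers: using less energy saves more water
# upstream than the towers consume on site.
```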
  • 36. Data Center Efficiency. Choices regarding power, packaging, cooling, and energy recovery in data centers drive TCO. Why should we care? Carbon footprint; water usage; mega-dollars per MW-year; cost: OpEx ~ IT CapEx! A less efficient data center takes away power and dollars that could otherwise be used for compute capability.
  • 37. Holistic Thinking. Approach to cooling: air vs. liquid, and where – components, liquid doors, or CRACs, ...? What is your “ambient” temperature – 55, 65, 75, 85, 95, 105°F (13, 18, 24, 30, 35, 40.5°C)? Electrical distribution: 208 V or 480 V? “Waste” heat: how hot? Liquid or air? Throw it away or use it?
  • 38. Liquid Cooling – New Considerations. Air cooling: humidity, fan failures, air-side economizers, particulates. Liquid cooling: pH and bacteria, dissolved solids, corrosion inhibitors, etc. When considering liquid-cooled systems, insist that providers adhere to the latest ASHRAE water-quality spec or it could be costly.
  • 39. 2011 ASHRAE Liquid Cooling Guidelines.
  • 40. 2011 ASHRAE Thermal Guidelines. 2011 Thermal Guidelines for Data Processing Environments – Expanded Data Center Classes and Usage Guidance; white paper prepared by ASHRAE Technical Committee TC 9.9.
  • 41. Energy Savings Potential: Economizer Cooling. [Figure: energy savings potential for the recommended envelope, Stage 1: economizer cooling. Source: Billy Roberts, NREL.]
  • 42. Data Center Energy. Data centers are energy-intensive facilities: 10–100x more energy intensive than an office; server racks well in excess of 30 kW; power and cooling constraints in existing facilities. Data center inefficiency steals power that would otherwise support compute capability. It is important to have a DC manager responsible for ALL DC cost, including energy!
  • 43. Energy Savings Potential: Economizer + Direct Evaporative Cooling. [Figure: energy savings potential for the recommended envelope, Stage 2: economizer + direct evaporative cooling. Source: Billy Roberts, NREL.]
  • 44. Energy Savings Potential: Economizer + Direct Evap. + Multistage Indirect Evaporative Cooling. [Figure: energy savings potential for the recommended envelope, Stage 3: economizer + direct evaporative + multistage indirect evaporative cooling. Source: Billy Roberts, NREL.]
  • 45. Data Center Energy Efficiency. ASHRAE 90.1-2011 requires an economizer in most data centers. ASHRAE Standard 90.4P, Energy Standard for Data Centers and Telecommunications Buildings. Purpose: to establish the minimum energy efficiency requirements of data centers and telecommunications buildings for design, construction, and a plan for operation and maintenance. Scope: this standard applies to new buildings, new additions, and modifications to data centers and telecommunications buildings or portions thereof and their systems. It will set a minimum PUE based on climate. More detail at: https://www.ashrae.org/news/2013/ashrae-seeks-input-on-revisions-to-data-centers-in-90-1-energy-standard-scope
  • 46. Energy Conservation Measures. 1. Reduce the IT load – virtualization and consolidation (up to 80% reduction). 2. Implement a contained hot-aisle and cold-aisle layout: curtains, equipment configuration, blank panels, cable entrance/exit ports. 3. Install an economizer (air or water) and evaporative cooling (direct or indirect). 4. Raise the discharge air temperature; install VFDs on all computer-room air conditioning (CRAC) fans (if used) and network the controls. 5. Reuse data center waste heat if possible. 6. Raise the chilled-water (if used) set point: increasing chilled-water temperature by 1°C reduces chiller energy use by about 3%. 7. Install high-efficiency equipment, including UPS, power supplies, etc. 8. Move chilled water as close to the server as possible (direct liquid cooling). 9. Consider a centralized high-efficiency water-cooled chiller plant: air-cooled ≈ 2.9 COP, water-cooled ≈ 7.8 COP.
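Measures 6 and 9 lend themselves to a quick estimate. A minimal sketch, assuming a hypothetical constant 1,000 kW cooling load and applying the slide's ~3% per °C rule of thumb linearly:

```python
# Rough impact of two measures from the list above, under assumed loads.
LOAD_KW_THERMAL = 1000.0     # hypothetical constant cooling load
HOURS = 8760

for label, cop in [("Air-cooled chiller (COP 2.9)", 2.9),
                   ("Water-cooled chiller (COP 7.8)", 7.8)]:
    kwh = LOAD_KW_THERMAL / cop * HOURS
    print(f"{label}: {kwh/1000:,.0f} MWh/yr")   # ~3,021 vs ~1,123 MWh/yr

# Chilled-water reset: ~3% chiller energy saved per 1 degC setpoint increase.
base_mwh = LOAD_KW_THERMAL / 7.8 * HOURS / 1000
for delta_c in (1, 3, 5):
    print(f"+{delta_c} degC setpoint: ~{base_mwh * 0.03 * delta_c:,.0f} MWh/yr saved")
```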
  • 47. Equipment Environmental Specification. The air inlet to IT equipment is the important specification to meet; outlet temperature is not important to the IT equipment.
  • 48. Key Nomenclature. Recommended range (statement of reliability): preferred facility operation; most values should be within this range. Allowable range (statement of functionality): robustness of equipment; no values should be outside this range. [Diagram: rack intake temperature bands from minimum allowable through the recommended range to maximum allowable, with under-temp and over-temp regions marked.]
  • 49. Improve Air Management. Typically, more air is circulated than required. Air mixing and short-circuiting lead to low supply temperature and low delta-T. Use hot and cold aisles, and improve isolation of hot and cold aisles to reduce fan energy, improve air-conditioning efficiency, and increase cooling capacity. A hot-aisle/cold-aisle configuration decreases mixing of intake and exhaust air, promoting efficiency. Source: http://www1.eere.energy.gov/femp/pdfs/eedatacenterbestpractices.pdf
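The fan-energy point follows from how airflow scales with delta-T. A small sketch with an assumed 10 kW rack (illustrative only):

```python
# Airflow needed to remove a given IT load scales inversely with delta-T.
RACK_KW = 10.0           # assumed rack heat load
RHO_CP = 1.2 * 1.006     # kJ/(m^3*K) for air near room conditions

for delta_t_c in (5, 10, 15, 20):
    m3_per_s = RACK_KW / (RHO_CP * delta_t_c)
    cfm = m3_per_s * 2118.88            # 1 m^3/s ~ 2118.88 CFM
    print(f"dT = {delta_t_c:>2} degC -> {m3_per_s:.2f} m3/s (~{cfm:,.0f} CFM)")
# Mixing and short-circuiting lower the return-air temperature (low delta-T),
# so the CRAC/CRAH fans must push far more air for the same heat load.
```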
  • 50. Isolate Cold and Hot Aisles. [Diagram: with isolation, supply air of 70–80°F instead of 45–55°F and return air of 95–105°F instead of 60–70°F.] Source: http://www1.eere.energy.gov/femp/pdfs/eedatacenterbestpractices.pdf
  • 51. Adding Air Curtains for Hot/Cold Isolation. Photo used with permission from the National Snow and Ice Data Center. http://www.nrel.gov/docs/fy12osti/53939.pdf
  • 52. Courtesy of Henry Coles, Lawrence Berkeley National Laboratory.
  • 53. Three (3) Cooling Device Categories. [Diagrams: rack, row, and passive-door cooler configurations, showing cooling water, containment, and the server front.] 1 – Rack cooler: APC (water), Knürr CoolTherm (water), Knürr CoolLoop (water), Rittal (water). 2 – Row cooler: APC (2*) (water), Liebert (refrigerant). 3 – Passive door cooler: IBM (water), Vette/Coolcentric (water), Liebert (refrigerant), SUN (refrigerant). Courtesy of Henry Coles, Lawrence Berkeley National Laboratory.
  • 54. “Chill-off 2” Evaluation of Close-coupled Cooling Solutions. [Chart: comparison of close-coupled cooling solutions by energy use.] Courtesy of Geoffrey Bell and Henry Coles, Lawrence Berkeley National Laboratory.
  • 55. Cooling Takeaways. Use a central plant (e.g., chiller/CRAHs) vs. CRAC units. Use centralized controls on CRAC/CRAH units to prevent simultaneous humidifying and dehumidifying. Move to liquid cooling (room, row, rack, chip). Consider VSDs on fans, pumps, chillers, and towers. Use air- or water-side free cooling. Expand the humidity range and improve humidity control (or disconnect it).