Data Centers: Get 'Em Hotter and Wetter

Data centers can run hotter and with more humidity than traditionally thought, according to a recent report from The Green Grid. This will reduce energy consumption and cut operating costs, but long-held myths might stand in the way.

November 28, 2012

Heat and humidity are bad for data centers, right? Maybe not as much as we thought. A recent white paper from The Green Grid, the non-profit organization that first put forward the Power Usage Effectiveness (PUE) metric, explains the benefits of allowing data centers to run hotter and with more humidity than is common.

You don't just have to take the Green Grid's word for it: The white paper brings together and summarizes work by the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) and other researchers. The report provides clear explanations of why hotter and wetter is something you should consider.

The Green Grid and other organizations are pushing for wider temperature and humidity ranges because the less you cool your data center, the less energy you consume. Likewise, the less you have to humidify or dehumidify, the less energy you consume. The bottom line is the potential for significant reductions in operating costs.

However, a number of myths have to be dispelled before we can expect the industry to embrace hotter, wetter data centers. Chief among them is the myth that computers operate better at colder temperatures. The Green Grid, ASHRAE and others aim to dispel this and other myths with research.

ASHRAE Technical Committee 9.9 has published engineering standards for "Mission Critical Facilities, Technology Spaces, and Electronic Equipment" for more than a decade. For most data center operators with highly diverse equipment, these standards work because they cover all types of IT gear. In 2011, the committee issued updated guidance, "Thermal Guidelines for Data Processing Environments: Expanded Data Center Classes and Usage Guidance." If you like digging into the engineering details, I strongly recommend reading this document. If you just want the summary, stay with the Green Grid report.

In 2008, well before the 2011 update, ASHRAE lowered the bottom end and raised the top end of its "recommended" temperature range (set in 2004) by 2°C each, to a low of 64.4°F and a high of 80.6°F. The "allowable" temperature range stayed the same at 59°F to 89.6°F. So why didn't everyone raise their temperatures in 2008? Because ASHRAE didn't provide the research background in its report. Without strong evidence, skeptics just couldn't let go of their mythology. The TC 9.9 2011 report delivers much the same message as the 2008 version, but shares the research for all to see. The recommendations now have solid data to back them up.

The main issue with changing the environmental ranges for data centers is the impact on reliability. Research from ASHRAE and other organizations, including Los Alamos National Lab, the IEEE and ITHERM, shows some empirical evidence (correlation, not verified causality) of a slight increase in failure rates at higher temperatures.

Wait! Any increase in failures is bad, right? That's where the science comes in. The Green Grid points out that the research shows very small increases in failure rates. In fact, the savings from reduced energy consumption dwarf the replacement costs for those failures.

Take the hypothetical example of a data center with 1,000 servers. If the assumed baseline failure rate is 10 servers per year at 68°F, the correlated data suggests an average of three more server failures per year at 81.5°F. At an assumed replacement cost of $10,000 per server, that's an extra $30,000 per year, although this doesn't account for the potential business impact of server failures.
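For readers who want to adjust the assumptions, here is a minimal sketch of that arithmetic in Python. The server count, failure rates and replacement cost are the article's illustrative figures, not measured data.

```python
# Failure-cost arithmetic from the hypothetical example above.
# All inputs are the article's illustrative assumptions.

SERVERS = 1000
BASELINE_FAILURES = 10     # failures/year assumed at 68 F
EXTRA_FAILURES = 3         # additional failures/year correlated with 81.5 F
COST_PER_SERVER = 10_000   # assumed replacement cost, in dollars

extra_cost = EXTRA_FAILURES * COST_PER_SERVER
print(f"Baseline failure rate at 68 F: {BASELINE_FAILURES / SERVERS:.1%}")
print(f"Added replacement cost at 81.5 F: ${extra_cost:,}/year")  # $30,000/year
```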

So what are the energy savings from increasing the operating temperature of the data center from 68°F to 81.5°F? The answer depends heavily on your geography, data center design and efficiencies, but some industry averages can give a rough order of magnitude for illustration.

If we assume a data center with a 500-kW load and an electricity rate of 15 cents per kWh, the annual energy bill would be approximately $650,000. A PUE of 2.0 means that about $325,000 of that bill goes to cooling and power distribution.

The Green Grid and others estimate a potential savings of 20% to 50% in mechanical-system energy costs from taking advantage of the wider operating temperatures. Even at the low end, 20% works out to about $65,000, more than double the exaggerated $30,000 replacement cost for three servers. The sketch below walks through the math.
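Here is the same back-of-the-envelope math in Python. The load, electricity rate, PUE and 20% savings figure are the illustrative assumptions used above; the article rounds the results down slightly.

```python
# Energy-savings arithmetic from the example above. Load, rate, PUE and
# the 20% mechanical savings are illustrative assumptions, not measurements.

HOURS_PER_YEAR = 8760
load_kw = 500    # assumed total facility load
rate = 0.15      # assumed electricity rate, dollars per kWh
pue = 2.0        # total facility power divided by IT power

annual_bill = load_kw * HOURS_PER_YEAR * rate            # ~$657,000
overhead_share = (pue - 1) / pue                         # 0.5 at PUE 2.0
cooling_and_distribution = annual_bill * overhead_share  # ~$328,500
savings = cooling_and_distribution * 0.20                # ~$65,700

print(f"Annual energy bill:     ${annual_bill:,.0f}")
print(f"Cooling/distribution:   ${cooling_and_distribution:,.0f}")
print(f"20% mechanical savings: ${savings:,.0f}")
```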

That comparison is only on a cost basis. Green Grid points out that a 20% energy savings is also a 20% reduction in carbon footprint. There are other benefits, as well.

Of course, this isn't all puppy dogs and rainbows. There are challenges in pushing your thermostat up to 81.5°F. Many data centers weren't designed with air or water economizers to take full advantage of outside temperatures, which means you might not reduce your mechanical cooling as much as predicted.

In addition, many data centers don't have contained hot or cold aisles, and raising the temperature without containment can lead to hot spots that climb above 81.5°F. Finally, most data centers lack the humidity control systems needed to control to dew point at these higher temperatures. Controlling relative humidity (instead of dew point) at 81.5°F is a big mistake and will take you well outside the ASHRAE limits for how wet the air should be, as the sketch below illustrates.
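To see why dew point is the right control variable, consider what happens if you hold relative humidity constant while raising the temperature. This minimal Python sketch uses the Magnus-Tetens approximation for dew point; the 50% RH setpoint is a hypothetical example, not an ASHRAE figure.

```python
# Holding relative humidity constant while raising temperature lets the
# dew point (absolute moisture) climb. Uses the Magnus-Tetens approximation.
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Approximate dew point (C) from dry-bulb temperature and relative humidity."""
    a, b = 17.27, 237.7  # Magnus-Tetens constants
    gamma = (a * temp_c) / (b + temp_c) + math.log(rh_percent / 100.0)
    return (b * gamma) / (a - gamma)

for temp_f in (68.0, 81.5):
    temp_c = (temp_f - 32) * 5 / 9
    dp_f = dew_point_c(temp_c, 50.0) * 9 / 5 + 32  # hold RH at 50% in both cases
    print(f"{temp_f} F at 50% RH -> dew point {dp_f:.1f} F")
# 68.0 F -> ~48.7 F dew point; 81.5 F -> ~61.0 F dew point (much wetter air)
```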

That said, these challenges don't invalidate the Green Grid's recommendations or ASHRAE's new guidelines, though they might limit their application in existing facilities. If the industry is ever going to move past poor data center designs and mythology, new facilities will need to factor in this science-based approach. The Green Grid's report provides thought leadership for the data center industry and is a must-read for IT and facilities pros alike.

Ken Miller is a data center architect with the IT Infrastructure and Operation Services division of Midwest ISO, developing mission-critical facilities.
