Beating the Heat in IT

IT managers discuss the pros and cons of cooling technologies

June 30, 2007


For many people, summer is a time to get outdoors and enjoy the weather, but for IT managers, soaring temperatures underline the need for efficient cooling.

More vendors are jumping on the power and cooling bandwagon, including American Power Conversion, CA, Egenera, HP, IBM, Knurr, and Liebert (which is part of Emerson Network Power). Users face a bewildering array of technologies for keeping the heat down in their data centers, from water-based technologies to purpose-built racks for controlling airflow. (See The Green Monster, The Big Chill, IBM Chills Out, APC Helps Sisters of Mercy, HP Boasts Cooler BladeCenter, and HP Cuts Power & Cooling.)

As the thermometer pushes toward 100 degrees Fahrenheit, the effect on server and storage devices can be significant, according to Jon Drake, director of technologies at the Legacy Bank of Texas. "For every 18 degrees over 70, you increase your failure rate by 25 percent," he says, explaining that equipment processors typically run 10 to 15 degrees hotter than the surrounding air.
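Drake's rule of thumb lends itself to a quick back-of-the-envelope check. The Python sketch below is our own illustration, not the bank's formula; in particular, compounding the 25 percent for fractional 18-degree steps is an assumption.

    # A minimal sketch of Drake's rule of thumb: roughly 25 percent more
    # failures for every 18 degrees F above a 70 degree F baseline.
    # Compounding per fractional step is our assumption for illustration.

    def relative_failure_rate(room_temp_f: float, baseline_f: float = 70.0) -> float:
        """Failure-rate multiplier relative to the 70 degree F baseline."""
        if room_temp_f <= baseline_f:
            return 1.0
        steps = (room_temp_f - baseline_f) / 18.0  # fractional 18-degree steps
        return 1.25 ** steps  # +25 percent per step, compounded

    for temp_f in (70, 88, 100):
        print(f"{temp_f} F -> {relative_failure_rate(temp_f):.2f}x baseline failure rate")

At 88 degrees the multiplier is 1.25x; at the 100-degree mark the article mentions, it works out to roughly 1.45x the baseline failure rate.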

The Plano, Texas-based bank uses a specialized air-conditioning device from American Power Conversion within its data center to put a lid on its heat production. Sitting within a row of racks, NetworkAIR uses a cooling technique called "hot aisle/cold aisle" that draws warm air from the space between two rows of servers, cools it, and then pumps it into the designated cold aisle. "Once you reach a certain size, I believe that you have to [have a specialized cooling] system," says Drake, who has about 60 servers in his data center.
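The payoff of keeping hot and cold air apart shows up in the standard airflow-sizing rule of thumb, CFM = BTU/hr divided by (1.08 x delta-T): the wider the temperature difference a cooling unit works across, the less air it has to move. Here is a minimal sketch using that generic HVAC rule; it is not APC's design math, and the 60-server load with its per-server wattage is a hypothetical illustration.

    # Generic airflow sizing, not an APC formula: the airflow needed to
    # remove a heat load shrinks as the hot/cold temperature split widens.

    def required_cfm(load_watts: float, delta_t_f: float) -> float:
        """Airflow (cubic feet per minute) to remove load_watts at a given delta-T."""
        btu_per_hr = load_watts * 3.412  # watts -> BTU/hr
        return btu_per_hr / (1.08 * delta_t_f)

    load_watts = 60 * 400.0  # hypothetical: 60 servers at ~400 W each
    for delta_t in (10, 20, 30):
        print(f"delta-T {delta_t} F -> {required_cfm(load_watts, delta_t):,.0f} CFM")

Doubling the usable delta-T halves the airflow the unit must move, which is the practical case for separating the aisles in the first place.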

A recent report from the U.K.'s National Computing Centre revealed that 86 percent of users cite cooling as a major storage concern, but warned that space is also fast becoming a major challenge for many firms. (See Users Voice Cooling Concerns.)

The resulting squeeze affects plenty of organizations, such as the Southern Ohio Medical Center (SOMC) in Portsmouth, Ohio. "We had to shoe-horn a data center into a small space, so the volume of air is small," explains Howard Stuart, the Center's radiology information systems manager, adding that his data center is only 30 square feet.

The exec told Byte and Switch that he opted for American Power Conversion's InfraStruXure system, which encloses data center racks to more effectively separate hot and cold air. (See Medical Center Deploys APC.) "For us, it was cheaper than building another data center."

The challenges posed by cooling are the same the world over. Sanzio Bassini, director of the systems and technology department at Italian supercomputing site Cineca, admits that cooling, and the energy needed for it, represent a major challenge. (See Cineca Picks Acronis.) "Our data center uses, at this point, something like two and a half megawatts of power," he says.
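The scale of that draw is easy to put in perspective with simple arithmetic. The electricity tariff below is a hypothetical placeholder; the article gives no Cineca rates.

    # Rough annual energy for a steady 2.5 MW draw. The tariff is an
    # assumed illustrative figure, not a Cineca number.

    power_mw = 2.5
    hours_per_year = 8760
    eur_per_kwh = 0.10  # hypothetical tariff

    annual_mwh = power_mw * hours_per_year        # 21,900 MWh
    annual_eur = annual_mwh * 1000 * eur_per_kwh  # ~2.19 million EUR
    print(f"{annual_mwh:,.0f} MWh/year, on the order of {annual_eur:,.0f} EUR/year")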

With 1,300 IBM BladeCenter devices packed into his data center, heat production is certainly on Bassini's radar, even if he is unmoved by some of the recent advances in cooling technology from the likes of IBM, HP, and Egenera. (See Blades Still Too Hot.) "We didn't decide on water cooling systems," he says. "[And] we didn't adopt exotic technology based on closed rooms and closed tunnels to keep hotspots in a certain manner."

Instead, Cineca uses what Bassini describes as "a general cooling system," with air coming up from below a raised floor to cool his servers and storage. "The reason is that it's more flexible," he says, explaining that his high-performance computing systems have only a three-and-a-half-year lifespan, something that would make purpose-built water or airflow cooling systems an expensive option. (See IBM Unveils Cool Blue, Go With the Flow, and Data Center Heat Wave.)

Another user eschewing specialized cooling products is Karl Lewis, storage administrator for the University of Michigan's engineering department. "We use traditional data center cooling, nothing fancy," he says, explaining that the department uses a similar raised-floor approach to Cineca. "It's just traditional airflow intake from under the floor."

Lewis admits that, with between 70 and 80 servers, a couple of ONStor NAS boxes, and some tape devices, the department's data center is not exactly bursting at the seams. "We don't do any high-density computing," he explains, adding that his server and storage racks are only half-populated in an attempt to reduce heat.
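Half-populating racks does not reduce the total heat, but it spreads it out. A quick sketch of the density arithmetic: the server count comes from the article, while per-server wattage and rack capacity are assumptions for illustration.

    # Same total heat load, half the per-rack density. Wattage and
    # slots-per-rack are hypothetical illustrations.

    servers = 75              # "between 70 and 80 servers"
    watts_per_server = 350.0  # assumed average draw
    slots_per_rack = 20       # assumed usable slots per rack

    total_watts = servers * watts_per_server
    for fill, label in ((1.0, "fully packed"), (0.5, "half-populated")):
        racks = servers / (slots_per_rack * fill)
        print(f"{label}: {total_watts / racks:,.0f} W per rack across {racks:.1f} racks")

The airflow serving each rack has half as much heat to carry away, at the cost of extra floor space, which is the trade-off Lewis is making.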

— James Rogers, Senior Editor, Byte and Switch

  • American Power Conversion Corp. (APC) (Nasdaq: APCC)

  • CA Inc. (NYSE: CA)

  • Egenera Inc.

  • Hewlett-Packard Co. (NYSE: HPQ)

  • IBM Corp. (NYSE: IBM)

  • Knurr Inc.

  • ONStor Inc.
