Summer Storage Survival

Need tips on keeping data gear cool in record heat? Try these

August 3, 2006


As the Weather Channel shows, when temperatures in New York City match the heat in Phoenix, weird things can happen. And for IT pros, power outages, thunderstorms, and sauna-like conditions pose a serious threat to corporate data.

Ironically, some of the measures users take to cope with burgeoning data add to the danger. Blade servers and dense storage arrays can soak up twice as much power and cool air as their predecessors. Indeed, it's precisely these conditions that have become a focus of new development by various suppliers.

This week, for instance, IBM introduced new "Cool Blue" software to manage data center power consumption and thermal conditions along with its AMD Opteron-based servers. (See IBM Bolsters Blade Strategy.) And HP last year added a Power Regulator feature to its ProLiant servers that automatically monitors and controls power by policy or application activity. HP also claims its BladeSystem c-Class servers are equipped with fans that consume less power while delivering more cooling, and the vendor touts a "self-cooled rack" called the HP Modular Cooling System.

These are just the tip of an iceberg of supplier claims. RAID controllers from Digi-Data now take "power naps" to conserve resources, and Intel says its Dual-Core Intel Xeon Processor 5100 uses 40 percent less power. (See Digi-Data Takes Power Nap.) VTL vendor Copan, meantime, has built its reputation in part on the ability of its MAID (massive array of idle disks) systems to power SATA drives up and down to save energy.

There is also plenty of environmental gear from companies like American Power Conversion, which specializes in the requirements of data centers with blade servers and dense storage systems. And if you're up for moving, new data center space built with advanced cooling and power features is available from the likes of Digital Realty Trust Inc. and 365 Main. (See 365 Main and Digital Realty Buys Datacenters.)

Clearly, new technology purports to help fight issues of heat and power. And at least one analyst thinks it's worth a switch, so to speak. "[Identify] what devices are consuming the most power and generating the most heat. Look to see if these devices can be replaced with new devices that require less power and cooling," suggests Greg Schulz, founder and senior analyst of the StorageIO consultancy. And he's not talking just about storage systems: "[S]ome older Fibre Channel switches and directors can generate significant amounts of heat," he notes. Summer heat waves may be a prime time to refresh the technology.
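
As a rough way to act on Schulz's suggestion, here is a minimal sketch in Python that ranks an equipment inventory by power draw to surface refresh candidates. The device names and wattages are made up for illustration; substitute measured or nameplate figures from your own floor.

    # Hypothetical inventory: device -> nameplate power draw in watts.
    # All names and figures below are illustrative, not measured data.
    inventory = {
        "blade chassis": 4500,
        "midrange disk array": 2200,
        "legacy FC director": 1800,
        "tape library": 900,
        "old 2-Gbit FC switch": 650,
    }

    # Rank devices by draw: the biggest power consumers are usually the
    # biggest heat generators, and the first candidates for replacement.
    for device, watts in sorted(inventory.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{device}: {watts} W")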

Still, replacing equipment can be a Catch-22, Schulz notes, since some newer storage kit may be smaller but is often denser than what came before, requiring special planning for cooling and power.

So is moving or making a forklift upgrade the only way to survive sizzling summers? Not always. There is plenty users can do before buying all-new equipment. Here is a handful of tips:

  • Shut off the lights if you don't need them. Two IT managers suggested this in an informal survey. A data center manager for a large airline also suggests "powering down any equipment that is not required for the production environment."

  • Check your backup. We don't mean storage backup, for once. "[F]irst and foremost, make sure your contingency or alternate power source, such as a generator, is in working condition and that you have adequate fuel supplies to endure a prolonged power outage," advises Schulz.

  • Don't over-cool. "Some people like to freeze their data centers, and they don't need to do that," says Justin Polazzo, network engineer with Allarus Technology Management, an IT consulting and outsourcing firm based in New York City. As long as the ambient temperature is 75 to 80 degrees Fahrenheit, he says, and the exhaust air coming off the top of the machines is about 85 degrees, it should be cool enough. (A monitoring sketch based on these numbers appears after this list.)

  • Make a disk swap. Some experts, including researchers at the University of Rochester, have suggested replacing server-class disks with laptop-class ones that require less power. Of course, this might interfere with your maintenance plan on commercial systems.

  • Disable automatic power management (APM) functions. According to Allarus's Polazzo, APM programs on servers can backfire if they automatically shut down power without consideration for attached workstations and other gear. In his opinion, it is better to use other forms of data center automation than the ones that come with APM routines.

  • Designate a high-density area and provide special cooling and power only to that area. American Power Conversion CTO Neil Rasmussen suggests in a white paper that when power requirements exceed 10 kilowatts per rack, it's time to isolate dense racks of equipment or blades in order to ensure that they can be cooled efficiently. Segregation helps to shorten the path between the cooling system and the dense rack, he maintains. (The cooling arithmetic behind that threshold is sketched after this list.)

  • Check floors and other pathways for air. A whole science surrounds the issue of data center flooring, but some key tips emerge, such as making sure that there is adequate ventilation underneath and around equipment. (See The Big Chill.) This will ensure that air conditioning systems don't have to overwork to keep things cool.

  • Keep your cool. The dog days of summer can try the patience of saints, let alone data center managers. So make sure to stay ventilated and hydrated yourself.
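
On the over-cooling point, here is a minimal monitoring sketch built around Polazzo's numbers. It assumes you can already pull ambient and exhaust readings from your environmental gear; the read_temps function below is a hypothetical stand-in, not a real API, and its return values are illustrative.

    # Polazzo's rule of thumb: 75-80F ambient, with exhaust air off the
    # top of the machines around 85F, is cool enough.
    AMBIENT_LOW_F, AMBIENT_HIGH_F = 75.0, 80.0
    EXHAUST_MAX_F = 85.0

    def read_temps():
        # Stand-in for a real sensor read (SNMP poll, serial probe, etc.).
        # The values returned here are illustrative only.
        return 72.0, 84.0  # (ambient F, exhaust F)

    ambient, exhaust = read_temps()
    if ambient < AMBIENT_LOW_F:
        print(f"Ambient {ambient}F: over-cooling -- wasting energy")
    elif ambient > AMBIENT_HIGH_F or exhaust > EXHAUST_MAX_F:
        print(f"Ambient {ambient}F / exhaust {exhaust}F: too warm -- check cooling")
    else:
        print(f"Ambient {ambient}F / exhaust {exhaust}F: within range")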
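
And on the high-density point, a quick bit of arithmetic shows why Rasmussen's 10-kilowatt-per-rack threshold forces special treatment: nearly every watt a rack draws comes back out as heat. The conversion factors below are standard; the rack figure is illustrative.

    # Nearly all electrical power drawn by IT gear is dissipated as heat,
    # so the cooling load tracks the power draw almost one for one.
    rack_watts = 10_000                      # Rasmussen's 10 kW threshold

    btu_per_hr = rack_watts * 3.412          # 1 W = 3.412 BTU/hr
    tons = btu_per_hr / 12_000               # 1 ton of cooling = 12,000 BTU/hr

    print(f"{rack_watts:,} W -> {btu_per_hr:,.0f} BTU/hr "
          f"(about {tons:.1f} tons of cooling) for one rack")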

Mary Jander, Site Editor, Byte and Switch

  • 365 Main Inc.

  • Advanced Micro Devices (NYSE: AMD)

  • American Power Conversion Corp. (APC) (Nasdaq: APCC)

  • Copan Systems Inc.

  • Digi-Data Corp.

  • Hewlett-Packard Co. (NYSE: HPQ)

  • IBM Corp. (NYSE: IBM)

  • Intel Corp. (Nasdaq: INTC)

  • The StorageIO Group
