Data Center Best Practices

Leading-edge operators and consultants share their tips on building ultraefficient, ultrasecure, and ultrareliable facilities.

March 1, 2008

There are data centers, and then there are data centers. The first kind ranges from the overheated, wire-tangled, cramped closets that sometimes also host cleaning supplies to the more standard glass-house variety of years past. The second kind--and the topic of this article--cools with winter air, runs on solar power, automatically provisions servers without human involvement, and can't be infiltrated even if the attacker is driving a Mack truck full-throttle through the front gate.

These "badass" data centers--energy efficient, automated, hypersecure--are held up as models of innovation today, but their technologies and methodologies could become standard fare tomorrow.

Everything at Equinix has been thought through for security

Rhode Island's Bryant University sees its fair share of snow and cold weather. And all that cold outside air is perfect to chill the liquid that cools the university's new server room in the basement of the John H. Chafee Center for International Business. It's just one way that Bryant's IT department is saving 20% to 30% on power consumption compared with just a year ago. "We've come from the dark ages to the forefront," says Art Gloster, Bryant's VP of IT for the last five years.

Before a massive overhaul completed in April, the university had four "data centers" scattered across campus, including server racks stuffed into closets with little concern for backup and no thought to efficiency. Now Bryant's consolidated, virtualized, reconfigured, blade-based, and heavily automated data center is one of the first examples of IBM's young green data center initiative.

IBM practices what it preaches, spending $79 million on its own green data center in Boulder, Colo. It spends $10 million a month on energy for all its data centers and hopes to keep the same environmental footprint through massive data center expansions.

Microsoft and Google also are putting heavy emphasis on environmental and energy concerns in building out their massive data centers, some of which cost upward of $500 million. Energy consumption at Microsoft's new data center in Ireland is half that of similar-sized data centers with similar configurations, says Rob Bernard, Microsoft's new chief environmental officer. "We looked at every aspect of where to site the building, how to drive more efficiency in the data centers," Bernard says. Google, while tight-lipped on details, is careful to locate its data centers near clean power sources.

But for Bryant, it's about more than cheap or even clean power. It used to be that most power outages would shut the network down. The last outage before Bryant opened its new data center took out the air conditioning, but not the servers, and the university was forced to wheel in portable air conditioners just to keep basic apps up and running. American Power Conversion alarms that register poor power or problematic temperatures went off all the time, but the university could do nothing about them. "There was no air conditioning distribution system in there," says Gloster. "It was all just one big pot, coming out of one duct."

Now the data center has a closed-loop cooling system using ethylene glycol, chilled by outside air when it's cold enough. On a cold December day, the giant APC chiller sits encased in snow, cooling the ethylene glycol. Rich Bertone, a Bryant technical analyst, estimates a 30% to 40% savings on cooling costs compared with more common refrigerant-based air conditioning.
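For readers curious how such a system decides when outside air is cold enough, here is a minimal sketch of an economizer-style control loop, in Python. The temperature thresholds and function names are assumptions for illustration only, not Bryant's or APC's actual control logic.

```python
# Hypothetical free-cooling control loop: use outside air to chill the
# closed glycol loop when the ambient temperature is low enough, and fall
# back to the mechanical chiller otherwise. Thresholds are illustrative,
# not taken from Bryant's or APC's configuration.

FREE_COOLING_MAX_F = 45.0   # assumed outdoor-temperature ceiling for free cooling
LOOP_SETPOINT_F = 55.0      # assumed target supply temperature for the glycol loop

def choose_cooling_mode(outdoor_temp_f: float, loop_temp_f: float) -> str:
    """Return which cooling stage should run for the closed glycol loop."""
    if outdoor_temp_f <= FREE_COOLING_MAX_F:
        return "free-cooling"          # dry cooler rejects heat to outside air
    if loop_temp_f > LOOP_SETPOINT_F:
        return "mechanical-chiller"    # compressor-based chilling takes over
    return "idle"                      # loop is already at setpoint

if __name__ == "__main__":
    for outdoor, loop in [(30.0, 60.0), (70.0, 62.0), (70.0, 50.0)]:
        print(f"{outdoor}F outside, {loop}F loop -> {choose_cooling_mode(outdoor, loop)}")
```

The savings depend on how many hours a year the first branch wins; in a climate like Rhode Island's, that can be a large fraction of the year.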
THINK COOL

That's not the only innovative way to keep things cool. The University of California at Berkeley is considering using a huge defunct particle-accelerator chamber as a reservoir for coolant to essentially "store cold" until it's needed, says IBM VP Steven Sams, head of the company's data center strategy consultancy. Google says it's a strong believer in much cheaper evaporative cooling (a.k.a. swamp coolers) in warm, dry climates. Co-location vendor Equinix has a data center that makes ice overnight when power is cheaper and uses the melting ice to cool during the day.

Another unusual step Bryant has taken is to build on grade, with no raised flooring. The university did it because of space constraints, but Forrester Research analyst James Staten says some of the largest tech companies are turning that building design into a trend for other reasons. "The new cooling systems really work better when they drop cold air down from the aisle rather than blow it up from below," Staten says. "This is true if you're using air cooling or the new liquid cooling." That's a controversial view: IBM's Sams says getting rid of raised floors isn't efficient at scale.

HP's cell architecture isolates cooling needs

Consolidation was one of the main goals of Bryant's data center upgrade. The initial strategy was to get everything in one place so the university could deliver on a backup strategy during outages. Little thought was given to going green. However, as Bryant worked with IBM and APC engineers on the data center, going through four designs before settling on this one, saving energy emerged as a value proposition.

The final location was the right size, near an electrical substation at the back of the campus, in a lightly traveled area, which was good for the data center's physical security. Proximity to an electrical substation was key. "The farther away the power supply, the less efficient the data center," Bertone says. Microsoft and Equinix both have data centers with their own substations.

AISO.net, a small Internet hosting company that hosted the Live Earth concert series online, went for a cleaner power supply when it converted to solar power in 2001. An array of 120 solar panels sits on the company's 1-1/3-acre property. "We saw that our costs were going to continually go up and said this was probably the right thing to do, too," says CTO Phil Nail.

It cost about $100,000 to outfit the company with solar panels, but Nail says AISO has made that money back and has continued to look for ways to cut its energy use even further, virtualizing almost every app running in its data center and pulling in cold air whenever the outside temperature drops below 50 degrees.
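The article doesn't break down AISO's numbers, but a back-of-envelope calculation shows how a roughly $100,000 array can pay for itself within several years. Every figure below other than the install cost is an assumption chosen purely for illustration.

```python
# Rough solar payback estimate. The $100,000 install cost comes from the
# article; the array output and utility rate are illustrative assumptions,
# not AISO.net's actual figures.

install_cost = 100_000      # USD (from the article)
annual_kwh = 180_000        # assumed kWh generated per year by the array
utility_rate = 0.12         # assumed USD per kWh of grid power avoided

annual_savings = annual_kwh * utility_rate          # $21,600 per year
payback_years = install_cost / annual_savings
print(f"Estimated payback: {payback_years:.1f} years")   # ~4.6 years under these assumptions
```

Higher utility rates or incentives shorten the payback; cloudier sites or lower rates stretch it out.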

Bryant is in the midst of deploying software that automatically manages server clock speed to lower power consumption, something that IBM co-developed with APC. Right now, APC technologies monitor and control fan speed, power level used at each outlet, cooling capacity, temperature, and humidity. Power is distributed to server blades as they need it.

When power goes out, Bryant no longer has to take the data center offline or bring out the portable air conditioning. A room near the data center hosts an APC Intelligent Transfer Switch that knows when to switch power resources to batteries, which can run the whole system for 20 minutes. If power quality falls out of line, the data center automatically switches to generator power and pages Bertone. The generator can run for two days on a full tank of diesel.
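The decision logic behind that kind of transfer switch is simple to express. Here is a minimal sketch; the voltage tolerance, state names, and notify() hook are assumptions for illustration, not APC's actual firmware.

```python
# Hypothetical power-failover logic in the spirit of Bryant's setup: fall
# back to battery when utility power goes out of spec, prefer the generator
# once it is ready, and page the on-call engineer. Thresholds and the
# notify() stub are illustrative assumptions, not APC's implementation.

NOMINAL_VOLTS = 480.0
TOLERANCE = 0.10             # assumed +/-10% power-quality window

def notify(message: str) -> None:
    print(f"PAGE on-call: {message}")   # stand-in for a paging/SMS gateway

def select_power_source(utility_volts: float, generator_ready: bool) -> str:
    out_of_spec = abs(utility_volts - NOMINAL_VOLTS) > NOMINAL_VOLTS * TOLERANCE
    if not out_of_spec:
        return "utility"
    notify(f"Utility power out of spec ({utility_volts:.0f} V)")
    if generator_ready:
        return "generator"    # diesel can run about two days on a full tank
    return "battery"          # bridge the gap (~20 minutes) until the generator comes up

if __name__ == "__main__":
    print(select_power_source(478.0, generator_ready=False))  # utility
    print(select_power_source(300.0, generator_ready=False))  # battery
    print(select_power_source(0.0, generator_ready=True))     # generator
```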

Other companies, including co-location provider Terremark Worldwide, are doing away with battery backup in some places, opting for flywheels. These heavy spinning wheels from companies such as Active Power can power equipment just long enough for generators to start.

Since Bryant doesn't have to constantly worry about data center reliability, it can focus on new strategic initiatives. It's working with Cisco Systems, Nokia, and T-Mobile to set up dual-band Wi-Fi and cellular service that will let students make free phone calls on campus. The university also is home to Cisco's IPICS communication center, linking emergency responders in Rhode Island and Connecticut; is moving toward providing students with unified communications and IPTV; and is in talks with an accounting software company to host apps in the Bryant data center to bring in extra cash.

Efficiency Takeaway

>> Energy sources are everywhere, from ambient cooling via external air to solar panels. Payback on such projects may be only a few years.

>> Locating data centers near power sources or substations can mean a sweet deal from utility providers.

"Before we did this data center, it was the thing that kept me up at night. ... Now, we have more time to be innovative," says Rich Siedzik, director of computer and telecommunications services at Bryant. The university needed to move from an operational focus to a strategic one, and "the data center allowed us to do that," Siedzik says. With all those projects, Bryant is now considered one of the most wired campuses in the country.

Not that VP of IT Gloster is satisfied. He says Bryant can go much further to save energy; it recently had a call with IBM to discuss how the university could cut its power costs by another 50%.

DIGITAL FORT KNOXES

Equinix's Ashburn, Va., data centers are hard to find. That's on purpose; it helps keep the bad guys out. The industrial park where they're located is devoid of any logos, and visitors and employees enter through unmarked doors, where they're met by a biometrically locked door--the first of five biometric systems they must pass before gaining physical access to servers. Each is a Schlage HandKey, which uses 94 independent 3-D measurements of hand geometry.

Everything at Equinix seems to have been thought through for security. The floor is a concrete slab, partly to cut down on wires running where eyes can track them and hands can splice them. The walls are painted black, partly to increase customer anonymity by making the environment darker. Security systems in Equinix's data center have their own keyed-entry power supplies and backups.

For Terremark, too, security is part of its value proposition. It recently built several 50,000-square-foot buildings on a new 30-acre campus in Culpeper, Va., using a tiered physical security approach that considers every layer from outside the fences to the machines inside.

For its most sensitive systems, there are seven tiers of physical security a person must pass before physically touching the machines. Those include berms of dirt along the perimeter of the property, gates, fences, identity cards, guards, and biometrics.

Among Terremark's high-tech physical security measures are machines that measure hand geometry against a database of credentialed employees and an IP camera system that acts as an electronic tripwire. If the cordon is breached, the feed from the camera that caught the breach immediately pops up on a bank of security monitors. That system is designed to recognize faces, but Terremark hasn't yet unlocked that capability.

Some of what Terremark says are its best security measures are the lowest tech. "Just by putting a gutter or a gully in front of a berm, that doesn't cost anything, but it's extremely effective," says Ben Stewart, Terremark's senior VP for facility engineering. After the ditches and hills, there are gates and fencing rated at K-4 strength, strong enough to stop a truck moving at 35 mph.

The choice of the Culpeper site was no accident. The company's other main data center is in Miami, but federal government customers didn't like that less-secure urban environment. In making the move, Terremark took into consideration not only Culpeper's rural location, but also the fact that it's outside the nuclear blast zone of Washington, D.C., so even a major nuclear attack wouldn't take out precious data.

Banks are notoriously skittish and secretive about their security strategies. But their facilities, too, are digital Fort Knoxes, with physical security taken to extremes. Deutsche Bank has located two of its data centers underground in the Black Forest in Germany. That's typical, says IBM's Sams. "In the United States, you'll see them in unlabeled backwoods locations with double-string barbed-wire enclosures," he says.

Glen Sharlun, VP of customer insight at security vendor ArcSight, isn't your typical salesman. He has seen his fair share of security operations as a former commanding officer for network security in the U.S. Marine Corps. Sharlun can't talk about the classified systems in place there, but his experience and insight have given him access to some advanced deployments and enlightened policies.

Security Takeaway

>> Defense in layers is still the best approach to security, and not all layers, such as strong fences, need to be high tech.

>> Internal threats often pose the greatest risks. Implement systems that authenticate users and track their actions.

Sharlun believes the internal threat, despite increasing press coverage over the last few years, is still poorly understood. He tells the story of a quality-assurance engineer who stole trade secrets from a mobile software company. The company reacted by implementing a biometric system to control access to its software, along with a system that creates a personalized digital watermark for each employee and imprints that watermark into the software whenever anyone does anything with it.

The user's location is another important factor. "An outsider somewhere in Uzbekistan may look like an insider because he has authenticated access," Sharlun says. Recently, a state government computer security executive told Sharlun how the state takes firewall logs of the IP addresses accessing its systems, which reveal the general location of whoever is logging in, and combines them with Google Maps to plot geographically where those users are coming from. "These types of things aren't that hard to do but can be really insightful," he says.

MAKE IT AUTOMATIC

The notion of data center automation has been around for years, but there's still no such thing as a data center that runs with no human intervention. Not that some companies aren't trying.

Even reference customers of innovative vendors are just starting with automation. SunTrust Banks started using BladeLogic six months ago for automated server provisioning and is beginning to use it for automated compliance and management. The infrastructure is fully rolled out, but the automation is in relatively early stages. SunTrust was driven to BladeLogic mainly by cost savings and a need to be more efficient with its time.

Before installing BladeLogic, SunTrust had to manually install all of its server operating systems and manually lay down third-party applications on top of those systems. Now, says Dexter Oliver, VP of distributed server engineering, "you rack a server, you push a button, and then the next button lays on the third-party applications."

Even with BladeLogic in place, there's no way to make sure the installed configurations stay intact once the systems go live, other than through periodic human monitoring. SunTrust's next step is to take products such as WebSphere and WebLogic that have specific configuration needs, build templates for those applications, and provision and maintain those configurations automatically, whether the apps are installed on virtual or physical machines, Windows or Linux.
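The underlying check is straightforward to picture. Here is a minimal sketch of the kind of drift detection SunTrust describes wanting; the setting names and template format are illustrative assumptions, not BladeLogic's data model.

```python
# Minimal configuration-drift check: compare a server's live settings
# against an approved template and report anything that has changed since
# provisioning. Keys and values here are illustrative assumptions.

def find_drift(template: dict, live: dict) -> dict:
    """Return {setting: (expected, actual)} for every value that differs."""
    return {
        key: (expected, live.get(key))
        for key, expected in template.items()
        if live.get(key) != expected
    }

# Hypothetical approved template for an app server and its current live state.
template = {"jvm_heap_mb": 2048, "session_timeout_min": 30, "ssl_enabled": True}
live     = {"jvm_heap_mb": 2048, "session_timeout_min": 60, "ssl_enabled": False}

print(find_drift(template, live))
# {'session_timeout_min': (30, 60), 'ssl_enabled': (True, False)}
```

Run on a schedule against every provisioned server, a check like this turns periodic human monitoring into an automated report.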

All that hands-off work frees up people to work on strategic projects, Oliver says. He would like to automate more but says that in order to get there, technologies and methodologies that can integrate the various tools in SunTrust's management toolbox still need to be created. "You've got configuration management databases, you've got this tool that does patching, this tool does monitoring," he says. "That orchestration piece is something that will need to be further developed."

Plenty of others would like to go as far as SunTrust plans to go. Communications technology company Mitel Networks is one. It runs Hewlett-Packard technology that can do scenario testing on workloads for disaster recovery. Mitel would like to proactively manage those workloads. "We originally invested in workload management just for failover scenarios, but we're now looking at it as a way to set up policies and then manage workload automatically," says David Grant, a Mitel data center manager. "We're prepared to invest some time and effort into this because we see our future in the IT space being tied in with this."

Mitel bought into a vision HP calls the Adaptive Enterprise. The company is pushing toward a heavily virtualized environment, aiming to evolve the data center so that "workloads will run where they need to run." It's already easy for Mitel to replicate servers and move virtual workloads around, but the next step is to automate that process.

Server and application provisioning isn't the only automation happening these days; runtime automation is going strong, too. One national insurance company is automating as many processes as it can using Opalis Integration Server. It started by automating month-end processes, then moved into early detection and repair of problems, and collection and central storage of audit logs. Month-end processing went from occupying 30 people for two weeks to five people for a total of three days.

"Our data centers are pretty dark," says Larry Dusanic, the company's director of IT. The insurer doesn't even have a full-time engineer working in its main data center in southern Nevada. Run-book automation is "the tool to glue everything together," from SQL Server, MySQL, and Oracle to Internet Information Server and Apache, he says.

Though Dusanic's organization uses run-book automation to integrate its systems and automate processes, the company still relies on experienced engineers to write scripts to make it all happen. "You need to take the time up front to really look at something," he says. Common processes might involve 30 interdependent tasks, and it can take weeks to create a proper automated script.
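To make "30 interdependent tasks" concrete, here is a minimal sketch of a run book expressed as a dependency graph and executed in order, halting if a step fails. The task names and structure are illustrative only; this is not Opalis syntax.

```python
# Toy run-book runner: each task maps to the tasks it depends on, and the
# runner executes them in dependency order, stopping the chain on failure.
# Task names are illustrative; this is not how Opalis defines workflows.

from graphlib import TopologicalSorter   # Python 3.9+

def run(name: str) -> bool:
    print(f"running: {name}")            # stand-in for the real action
    return True                          # return False to signal a failure

# A hypothetical month-end fragment.
runbook = {
    "close_ledgers":      set(),
    "export_totals":      {"close_ledgers"},
    "reconcile_accounts": {"export_totals"},
    "archive_audit_logs": {"close_ledgers"},
    "notify_finance":     {"reconcile_accounts", "archive_audit_logs"},
}

for task in TopologicalSorter(runbook).static_order():
    if not run(task):
        print(f"halting run book: {task} failed")
        break
```

The hard part, as Dusanic notes, isn't the runner; it's taking the time up front to map out what the tasks and dependencies actually are.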

One of the more interesting scenarios Dusanic has been able to accomplish fixes a problem Citrix Systems has with printing large files. The insurance company prints thousands of pages periodically as part of its loss accounting, and the application that deals with them is distributed via Citrix. However, large print jobs run from Citrix can kill print servers, printers, and the application itself.

Automation Takeaway

>> Automation isn't fully baked yet, but it's getting there. Provisioning and patch automation should be part of any data center.

>> Run-book automation and virtualization can be used to improve operations and security, and to provide audit trails.

Now, whenever a print job of more than 20 pages is executed from Citrix, a text file is created to say who requested the job, where it's being printed, and what's being printed. The text file is placed in a file share that Opalis monitors. Opalis then inputs the information into a database and load balances the job across printers. Once the task is complete, a notification is sent to the print operator and the user who requested the job. Dusanic says the company could easily make it so that if CPU utilization on the print server gets to a certain threshold, the job would be moved to another server automatically. "If we had a custom solution to do this, it probably would have cost $100,000 end to end," he says.
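Outside of Opalis, the same watch-folder pattern is easy to sketch: poll the share, read each job descriptor, spread the work across a pool of print servers, and notify the requester. The descriptor format, printer names, and helper functions below are illustrative assumptions, not the insurer's actual workflow.

```python
# Sketch of a watch-folder print balancer in the spirit of the Opalis flow
# described above. File formats, paths, and printer names are made up.

import itertools
import json
import time
from pathlib import Path

SHARE = Path(r"\\fileserver\printjobs")          # assumed watch folder
PRINTERS = itertools.cycle(["print01", "print02", "print03"])

def route_job(printer: str, job: dict) -> None:
    # Stand-in for real print submission and for notifying the operator/user.
    print(f"Routing {job['document']} ({job['pages']} pages) "
          f"for {job['requested_by']} to {printer}")

def watch_share(poll_seconds: int = 30) -> None:
    while True:
        for job_file in sorted(SHARE.glob("*.json")):
            job = json.loads(job_file.read_text())
            if job.get("pages", 0) > 20:          # threshold from the article
                route_job(next(PRINTERS), job)
            job_file.unlink()                     # descriptor handled; remove it
        time.sleep(poll_seconds)

# watch_share()   # uncomment to run against a real share
```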

No data center is perfect, and even the innovative few have their own foibles. For those companies that have already driven down the path of hyperefficient, secure, or automated data centers, there's a lot more to be done. That's the mark of true innovators: never giving up the good fight to stay ahead and keep the competitive edge.

Continue to the sidebar:
How eBay Manages Its Data Centers
