Data centers aren't usually front page news, but the Sunday New York Times put them center stage in an extensive article criticizing the infrastructure that supports the Web for inefficient practices that waste energy and pollute the environment. The article also questions Silicon Valley's green image, saying "...many data centers appear on the state government's Toxic Air Contaminant Inventory, a roster of the area's top stationary diesel polluters."
It's no secret that data centers consume massive quantities of power to keep servers humming and Web services available, but the Times article offers some staggering numbers: it puts worldwide data center electricity consumption at about 30 billion watts, roughly the output of 30 nuclear power plants, and estimates that U.S. data centers account for one-quarter to one-third of that usage. According to an analysis by McKinsey & Co., commissioned by the Times, data centers on average use only 6% to 8% of their electricity for computation; most of the rest goes toward keeping servers at the ready for traffic surges.
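As a rough sanity check on those figures, the arithmetic works out as follows (a minimal sketch; the ~1 gigawatt-per-plant figure is an assumption, since the article does not state a per-plant output):

```python
# Back-of-envelope check of the figures cited above.
# Assumption: a typical nuclear plant outputs ~1 GW (not stated in the article).

GLOBAL_DATACENTER_WATTS = 30e9   # ~30 billion watts worldwide, per the Times
NUCLEAR_PLANT_WATTS = 1e9        # assumed ~1 GW per plant

equivalent_plants = GLOBAL_DATACENTER_WATTS / NUCLEAR_PLANT_WATTS

# U.S. share: one-quarter to one-third of worldwide usage, per the article
us_low = GLOBAL_DATACENTER_WATTS * (1 / 4)
us_high = GLOBAL_DATACENTER_WATTS * (1 / 3)

# Electricity actually spent on computation, per the McKinsey estimate (6%-8%)
compute_low = GLOBAL_DATACENTER_WATTS * 0.06
compute_high = GLOBAL_DATACENTER_WATTS * 0.08

print(f"Equivalent nuclear plants: {equivalent_plants:.0f}")
print(f"U.S. share: {us_low / 1e9:.1f}-{us_high / 1e9:.1f} GW")
print(f"Spent on computation: {compute_low / 1e9:.1f}-{compute_high / 1e9:.1f} GW")
```

Under those assumptions, only about 1.8 to 2.4 of the 30 gigawatts consumed worldwide would be doing useful computational work.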
Meanwhile, many data centers also keep diesel generators on site to maintain power in the event of a grid outage. But, as the Times reports, companies have run into trouble over environmental permits. It cites a case in which Amazon was fined more than $260,000 for failing to obtain the required permits for diesel generators at a Virginia facility.
The requirement for always-on availability is what drives energy consumption--and it isn't limited to the massive server farms run by Google, Facebook, Amazon and other Internet titans. Reliability and availability ranked as the top application infrastructure requirement in the 2012 InformationWeek State of the Data Center survey, cited by 74% of respondents. In other words, enterprise data center operators are under just as much pressure as cloud providers to keep applications up and running.
Server virtualization can provide some relief from the consumption problem because one physical machine can do the job of two, three or more. Reducing the number of physical servers cuts power and cooling requirements and helps IT run a more efficient data center. However, energy efficiency isn't top of mind, according to our survey: when we asked respondents to rate the most compelling virtualization drivers on a scale of 1 to 10, with 10 being least compelling, power savings ranked seventh.
Data growth also drives up energy consumption, according to the Times article, as millions of users pile ever more photos, videos, emails and status updates into massive online repositories. Storage growth plays a role in enterprise data centers as well--in our survey, it ranked fourth among 17 trends affecting data center operations.
The data center industry recognizes that efficiency is an issue it must grapple with. One effort is the Open Compute Project (OCP), which Facebook launched in 2011 to share its data center designs and specifications with the Internet community at large, with the goal of promoting more energy-efficient data center design. Though initially targeted at high-end data center operators, its designs could also be implemented in the enterprise. This month, OCP released the 1.0 specification of Open Rack, which it touts as having better airflow and higher energy efficiency than standard rack designs. The project also publishes designs and specifications for server chassis and motherboards. Network Computing has a photo gallery from an OCP event last year.

Drew is formerly editor of Network Computing and currently director of content and community for Interop.