WASHINGTON -- A new EPA report shows that data centers in the United States could save up to $4 billion in annual electricity costs through more energy-efficient equipment and operations and broad implementation of best management practices. The "Report to Congress on Server and Data Center Energy Efficiency" identifies priority efficiency opportunities and recommends policies that could yield additional savings through state-of-the-art technologies and operations.
Data centers are facilities that contain IT equipment (computing, networking and data storage equipment), as well as power and cooling infrastructure. They are part of our critical national infrastructure, found in nearly every sector of the economy, including banking and financial services, media, manufacturing, transportation, education, health care and government.
Findings from the report include:
- Data centers consumed about 60 billion kilowatt-hours (kWh) in 2006, roughly 1.5 percent of total U.S. electricity consumption.
- The energy consumption of servers and data centers has doubled in the past five years and is expected to almost double again in the next five years to more than 100 billion kWh, costing about $7.4 billion annually.
- Federal servers and data centers alone account for approximately 6 billion kWh (10 percent) of this electricity use, at a total electricity cost of about $450 million per year.
- Existing technologies and strategies could reduce typical server energy use by an estimated 25 percent, with even greater energy savings possible with advanced technologies.
As the U.S. economy increasingly shifts from paper-based to digital information management, data centers have become a vital part of business, communication, academic, and governmental systems. Over the last five years, the growing use of these systems, along with the power and cooling infrastructure that supports them, has doubled energy use, increased greenhouse gas emissions, and raised concerns about power grid reliability.