
Facebook's Futuristic Data Center: Inside Tour


  • It rises out of the rocky central Oregon desert like a fortress, albeit an inviting one: Facebook's first data center complex, located on a mesa on the outskirts of Prineville, population 10,000.

    This is what the data center of the future looks like: massive complexes built near cheap power supplies in climatically favorable settings, designed to power huge Internet-based applications with minimal staff.

    With Apple building a data center complex next door to Facebook's facility, Prineville, Ore., might one day be known as one of the few mega-data center concentrations in the world.

    [ See our related feature story, Facebook's Data Center: Where Likes Live, for more details on the power-saving technologies at work in Prineville. ]

    This high desert seldom gets warmer than 86 degrees, even at the peak of summer, and nights are cool most of the year. The modern design of Facebook's building takes advantage of the climate by using evaporation as its sole cooling process. Servers run in a warm 80-degree atmosphere; on warm days, the air used for cooling moves closer to 85 degrees, according to Facebook data center managers. The Prineville facility currently consists of two massive 330,000-square-foot data halls, with room for a third on the site. A similar facility is already operating in Luleå, Sweden, and a third is under construction in Forest City, N.C. Like Sweden, central Oregon offers cheap hydroelectric power: large dam projects such as Bonneville pump power into distribution lines that reach markets in California and Nevada.

    Prineville has been Exhibit A for the Open Compute Project since April 2011, when Facebook, unlike other cloud data center builders, chose to share its data center and server design plans. Frank Frankovsky, VP of hardware design and supply chain operations at Facebook, and others have carried the message in a series of Open Compute summits. Project managers at data center builders such as DPR Construction say Open Compute is having an impact on the industry as a whole. Companies such as Sun Microsystems, Google and Microsoft have also built efficient data centers and boasted of their power usage effectiveness (PUE) ratings. But no company has yet beaten the PUE of 1.06 posted by Building Two of the Prineville complex.
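
    PUE is the ratio of total facility power to the power delivered to IT equipment, so 1.0 is the theoretical ideal and everything above it is overhead for cooling, lighting and power conversion. A minimal sketch of the arithmetic, with made-up load figures chosen only to land on Prineville's 1.06 -- these are not Facebook's actual meter readings:

    ```python
    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        """Power usage effectiveness: total facility power / IT power.

        1.0 is the theoretical ideal; a typical enterprise data center
        runs closer to 1.8-2.0.
        """
        return total_facility_kw / it_equipment_kw

    # Hypothetical figures for illustration only.
    it_load = 10_000.0   # kW drawn by servers, storage and network gear
    overhead = 600.0     # kW for cooling, lighting and power-distribution losses

    print(pue(it_load + overhead, it_load))  # 1.06 -> just 6% overhead
    ```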

    Facebook has had a big impact on the Prineville area, increasing its tax base, providing construction and data center jobs, and donating directly to the community.

    Take a look at what we learned and saw on a recent tour.


  • The Prineville site covers more than 100 acres and includes two data centers, or "data halls." The two are housed in identical buildings, each 330,000 square feet in size. More than 1,560 tons of steel, 14,254 cubic yards of concrete and 950 miles of wire and cable went into each building.

    Building Two (right) boasts some equipment advancements. Next to Building One (left) are banks of solar panels, angled toward the site's southern exposure, which produce electricity for the offices of both buildings. The data center equipment itself, however, runs off the electrical grid, as the solar panels can't reliably produce enough power to run the entire facility.


  • Just beyond the security gatehouse on SW Connect Way, flags fly near the entrance of Building Two. Various barriers block vehicles from getting too close to the glass-enclosed occupied areas of the building, and massive concrete walls protect equipment areas. A thick, four-foot-high stone wall encased in wire mesh blocks access to the building's office space. The building is 1,200 feet long and would stand 81 stories high if tipped on one end.


  • Data center manager Joshua Crass discusses the layout of servers in one of the suites of Building Two. Each building is divided into four server suites, with servers arranged along open cold aisles. The back sides of the servers face an enclosed "hot aisle," where heat is collected and siphoned to chambers in the building's top floor. From there it is either vented out of the building or mixed with cool outside air to be recirculated through the server areas.


  • The second-generation hardware conceived by the Open Compute Project, dubbed Winterfell, is being installed in Building Two of Facebook's Prineville complex. This image shows three of the elongated Winterfell servers side by side in a 2U tray that can slide in and out of the rack for servicing. The front is left open for better air passage, the components are aligned in parallel with the airflow, and two small fans at the rear of the motherboard draw cooling air across the server. The warmed air then exits into the hot aisle, where it's collected and expelled from the building or remixed with chilly outside air.

  • High walls of filters (left) and turquoise-colored filters (right) remove dust, insects and particles from the building's ambient air. With no compressor-based chillers, the Prineville data centers rely on outside air for all cooling operations. Depending on the time of year, some warm inside air is mixed with ambient air to achieve the right temperature before it is injected into server areas and moved across heated components by two small fans on each server. The filters have corrugated "paper" walls, similar to the circular air filters that used to sit under the hood of an automobile to filter air entering the carburetor.
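
    The blending step is a simple energy balance: the dampers admit just enough warm return air to pull the cold outside air up to the supply temperature. A minimal sketch of that proportioning, ignoring humidity effects; the temperatures below are illustrative assumptions, not Facebook's actual setpoints:

    ```python
    def return_air_fraction(t_outside: float, t_return: float, t_target: float) -> float:
        """Fraction of warm return air to blend with outside air so the
        mix lands on t_target. Simple linear mixing, ignoring humidity:
        mix = f * t_return + (1 - f) * t_outside, solved for f."""
        if t_return == t_outside:
            return 0.0
        f = (t_target - t_outside) / (t_return - t_outside)
        return min(max(f, 0.0), 1.0)  # dampers can't exceed fully open/closed

    # Illustrative winter day: 40 F outside, 95 F hot-aisle return, 80 F supply.
    print(return_air_fraction(40.0, 95.0, 80.0))  # ~0.73 -> mostly return air
    ```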


  • This is Munters media, an absorbent cellulose material used to cool and humidify warm desert air brought into the chamber through vanes and filters. Air passes through the material and is cooled by a small amount of water: the air gives up 10 to 12 degrees of heat as it supplies the energy needed to evaporate water into vapor. The water is collected, sent to an ultraviolet exposure chamber to kill bacteria and other living organisms, and then recirculated through the Munters media. Humidification is necessary in the data center because the dry desert air, particularly during winter, encourages static electrical discharges that can be harmful to data center equipment.
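
    That 10-to-12-degree drop follows from a simple energy balance: the sensible heat the air gives up equals the latent heat absorbed by the water it evaporates. A back-of-the-envelope sketch using standard physical constants; the water-per-kilogram-of-air figure is our own illustration, not a Munters specification:

    ```python
    # Adiabatic evaporative cooling: heat lost by the air = latent heat
    # absorbed by the water that evaporates into it.
    CP_AIR = 1.006   # kJ/(kg*K), specific heat of dry air
    H_FG = 2450.0    # kJ/kg, latent heat of vaporization of water near 20 C

    def water_per_kg_air(delta_t_c: float) -> float:
        """Kilograms of water evaporated per kilogram of air cooled by delta_t_c."""
        return CP_AIR * delta_t_c / H_FG

    # A 10-12 F drop is roughly a 5.5-6.7 C drop.
    for drop_c in (5.5, 6.7):
        print(f"{drop_c} C drop -> {water_per_kg_air(drop_c) * 1000:.1f} g water per kg air")
    # ~2.3-2.8 g of water per kg of air: modest, but enough at this scale
    # to justify the UV-treated recirculation loop in a water-scarce region.
    ```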

    Facebook's Prineville facility won the Engineering News-Record award for best green building of 2011, due in no small part to its energy- and water-conserving features in a region where energy is plentiful but water is scarce.


  • In the chamber where the filtered air has been cooled, Crass peers down a 9-foot-by-9-foot hole in the floor that lets air flow down a large walled passageway back to the servers. There it hits a deflector plate above the cold aisle and spreads out in a passive air distribution system. A more traditional approach to air conditioning the data center would require more energy: for cool air pushed up from floor level to reach the top of a server rack, it must start out much colder. Because Prineville's air falls onto the racks from above, it can be warmer at the start of its journey.


  • Crass inspects one of the exhaust fans sitting idle in the warm-air exhaust chamber above the server suites of the Prineville facility. Warm air is siphoned into the chamber and, on warm summer days, exits the building. With snow flurries in the air on Feb. 20, the day of our visit, only one or two of the fans were at work; most of the excess heat was being rerouted to warm the occupied areas of the building, such as the reception area, cafeteria, offices, break rooms and meeting rooms. The building's airflow management system constantly monitors conditions and automatically routes air as needed.
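
    In outline, that routing decision is a mode selection driven by outside conditions and the building's heating needs. A hypothetical sketch of the logic; the thresholds, mode names and structure are our own assumptions for illustration, not Facebook's control code:

    ```python
    from enum import Enum, auto

    class AirRoute(Enum):
        EXHAUST = auto()       # vent hot-aisle air out of the building
        HEAT_OFFICES = auto()  # reroute server heat to occupied spaces
        RECIRCULATE = auto()   # remix hot air with the cold intake stream

    def route_hot_air(outside_temp_f: float, offices_need_heat: bool) -> AirRoute:
        """Decide where hot-aisle air goes. The 70 F threshold is illustrative."""
        if outside_temp_f >= 70.0:
            return AirRoute.EXHAUST        # warm day: dump the heat outdoors
        if offices_need_heat:
            return AirRoute.HEAT_OFFICES   # reuse server heat for comfort heating
        return AirRoute.RECIRCULATE        # temper the chilly intake air

    # Snow flurries on Feb. 20: cold outside, offices need warmth.
    print(route_hot_air(30.0, offices_need_heat=True))  # AirRoute.HEAT_OFFICES
    ```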


  • From the roof of Building Two, a 62,000-square-foot "mini-building" that will serve as a cold storage center is visible. It will house a concentration of disk arrays, with a few running servers, to store Facebook's less frequently accessed pictures and content. This approach means it will take a few milliseconds longer to retrieve pictures you haven't looked at recently, but it's more cost-effective for Facebook, which handles 350 million picture uploads per day -- a photo storehouse that grows at the rate of 7 PB a month.
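
    Those two figures imply an average storage footprint per uploaded photo (each upload is stored at several resolutions). A quick check of the arithmetic, assuming decimal petabytes and a 30-day month:

    ```python
    UPLOADS_PER_DAY = 350_000_000  # photo uploads per day (from the article)
    GROWTH_PB_PER_MONTH = 7        # storage growth per month (from the article)

    bytes_per_month = GROWTH_PB_PER_MONTH * 10**15   # decimal petabytes
    uploads_per_month = UPLOADS_PER_DAY * 30         # 30-day month

    avg_kb = bytes_per_month / uploads_per_month / 1_000
    print(f"~{avg_kb:.0f} KB stored per upload")     # ~667 KB across all resolutions
    ```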
