Facebook's Data Center: Where Likes Live

Welcome to the Oregon high desert, where Facebook stores all of your likes while pursuing data center energy efficiency on a new scale. Coming soon to the neighborhood: Apple.

The small operating staff is seldom discomforted. All network connections and servicing of the equipment are done from the front of the racks -- the cold aisle; none is done from the back, the hot aisle. During our visit on Feb. 20, the hot aisle was about 72 to 74 degrees, mainly because the temperature outside was 30, with snowflakes in the air. In the fan room on the roof, most of the big exhaust fans were idle, and the surplus heat was going into heating the cafeteria, office space and meeting rooms.

Facebook has applied for a patent on how it steps down 12,500-volt power from a grid substation to the server racks. Power enters the data center at 480 volts and feeds a reactor power panel at each cold aisle of servers, which delivers 240-volt power to three banks of power supplies on each server rack. The design eliminates one transformer step, an energy-saving move since some power is lost with each step down in the conversion chain. Most enterprises lose about 25% of the power they bring into the data center through these steps; Facebook loses 7%.
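The arithmetic behind that gap can be sketched in a few lines: each conversion stage keeps only a fraction of its input power, so losses compound multiplicatively, and removing a stage helps the whole chain. The per-stage efficiencies below are hypothetical, chosen only to land near the 25% and 7% totals the article quotes; they are not Facebook's actual figures.

```python
def delivered_fraction(stage_efficiencies):
    """Fraction of grid power that survives every conversion stage."""
    fraction = 1.0
    for eff in stage_efficiencies:
        fraction *= eff
    return fraction

# A typical enterprise chain: several transformer step-downs plus a
# central UPS (hypothetical stage efficiencies, ~25% total loss):
typical = delivered_fraction([0.97, 0.93, 0.90, 0.92])

# A Prineville-style chain with one transformer step removed
# (again hypothetical stage values, ~7% total loss):
facebook = delivered_fraction([0.985, 0.975, 0.97])

print(f"typical enterprise: {(1 - typical):.0%} lost")
print(f"Prineville-style:   {(1 - facebook):.0%} lost")
```

The point of the sketch is that a 3% or 4% gain at one stage is small on its own, but deleting a stage entirely removes its loss from the product.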

Not every idea implemented at Prineville was invented by Facebook. Facebook executives give some credit to Google for the idea of a distributed power supply unit with battery on the server, as opposed to operating at a central point where power feeds into the data center. The difference is that the conversion from alternating current to direct, and back to alternating (which cost 5% to 8% of power at predecessor data centers) was cut to a much smaller percentage at Google data centers. The conversion was required to ensure that battery backup was fully charged and ready to go the instant the grid supply failed. Google found a way around that power penalty by distributing battery backup to each server.
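A rough sense of what that double conversion costs can be put in numbers. The figures below are hypothetical (the article gives only the 5% to 8% loss range), but they show why a battery on each server, which takes the central AC-to-DC-to-AC conversion out of the power path entirely, is worth the engineering trouble at this scale.

```python
# Hypothetical comparison of the two UPS topologies described above.
# Centralized UPS: all power passes through an AC->DC->AC double
# conversion (the article quotes 5% to 8% lost). Distributed design:
# a battery rides with each server's power supply, so that conversion
# stage -- and its loss -- disappears from the power path.

ANNUAL_IT_LOAD_KWH = 10_000_000       # hypothetical 10 GWh/year facility
DOUBLE_CONVERSION_LOSS = 0.065        # midpoint of the quoted 5-8% range

centralized_loss_kwh = ANNUAL_IT_LOAD_KWH * DOUBLE_CONVERSION_LOSS
print(f"Energy burned in a central UPS: {centralized_loss_kwh:,.0f} kWh/yr")
```

At this hypothetical scale, the central UPS alone would burn roughly 650,000 kWh a year that the distributed design avoids.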

But Facebook is happy to take credit for its own innovations as well. And perhaps more importantly, it's publishing the details of its power conserving servers in the Open Compute Project and opening up its data centers for wide inspection.

Crass, at an athletic 36 years old, garbed in a Facebook hoodie, jeans and sneakers, seems like he would be as much at home posting his latest achievements on a surfboard or snowboard to a Facebook page as managing this massive complex day after day. But he says it's the job he was cut out for.

Much of the real work of managing the facility is done by software regulating the air flow and monitoring the systems. The servers themselves, he said, are governed by a system that can invoke auto-remediation if a server stalls for any reason.

"Maybe a server is really wedged and needs a reboot. The remediation system can detect if the image is corrupted on the drive and can't reboot. Then it will re-image the machine" with a fresh copy, he explained. No technician rushes down the cold aisle to find the stalled server and push a reboot button. The remediation system "just solves most problems," he said.

Crass isn't allowed to disclose the total number of servers currently running; asked, he says only "tens of thousands." For comparison, Microsoft built a 500,000-square-foot facility outside Chicago that houses 300,000 servers. Reports on the capital costs for one building at Prineville show a total expense of $210 million, but that's not the cost of the fully equipped building. Microsoft and Google filings for large data centers in Dublin, Ireland, show costs between $300 million and $450 million.

The Prineville complex sits in the middle of a power grid that pipes hydroelectric power from the Bonneville and other dams in the Northwest to California and Nevada. Visitors pass under a giant utility right of way that consists of three sets of towers not far from the Prineville site.

The mega data center is a new order of compute power, operated with a degree of automation and efficiency that few enterprise data centers can hope to rival. For Crass, it's the place he wants to be. He and his wife lived in Portland before he took a job on a project in Iowa. Given the option to take on Prineville, he jumped at it. He knew it would be an implementation of the Open Compute architecture and a working test bed for its major concepts.

"I love it. It's an amazing place to work. It's open to everybody. You're able to be here and walk through it and take pictures," he noted at the end of the tour. Everybody likes to be running something cool and letting the world know about it, he said.

The Prineville data center incorporates the latest cloud server hardware, a huge picture storage service and a lean staff, Crass points out. For at least a while, this complex sports the best energy efficiency rating of any major data center in the world, and the lessons being learned here will reverberate through data center design into the future.


Comments
timwessels | 3/7/2013 3:11:39 AM
re: Facebook's Data Center: Where Likes Live
Well, another good article by Mr. Babcock, who is one of the best tech writers in the business. If you want to see the inside of Facebook's Prineville, Oregon, data center, there are a number of videos about it on YouTube. http://tinyurl.com/cg9rmgy
John Foley | 3/6/2013 6:15:31 PM
If you like data centers, a trip to Oregon is recommended. I've been to Google's data center in The Dalles and to Amazon's facility in Boardman, though I observed them from the outside, not the inside. Flying into Portland, driving along the Columbia River Valley where the wind turbines are spinning, and a side trip to Mt. Hood make for a great few days. I also rafted and fished in the Deschutes River. Land, sun, water, and wind -- you can see why they build data centers there.
fmann-craik950 | 3/6/2013 5:47:31 PM
Interesting article that gives Facebook an environmentally responsible "face." Thanks for sharing this in-depth info, Charles.
D. Henschen | 3/6/2013 5:26:47 PM
This account really demonstrates where the rubber of our digital lives meets the road. Great to see so many green initiatives coming together. It's pretty obvious that environmentalism and good old-fashioned capitalism -- the drive for efficiency and competitive advantage -- go hand in hand. If Facebook is as willing to share its data center breakthroughs as it has been its Open Compute Project server breakthroughs, the entire industry will benefit.