5 Data Center Trends For 2013

Energy efficiency will continue to be a major focus of data center operations over the coming year, but that's not all we'll see.

2. Natural Gas Gains Steam

Beyond reducing the amount of energy consumed, another energy question looms in data center operations: what type of energy to use. Until recently there was little debate; electricity purchased off the grid was nearly everyone's first choice.

But the U.S. is currently experiencing a glut of natural gas drilled from underground shale formations in North Dakota, Pennsylvania and other Appalachian states. Thanks to the oversupply, gas prices are at their lowest in years, and a few companies are poised to take advantage of it by burning natural gas in onsite generators to supply all of their power needs.

Datagryd is one of them, with a 240,000-square-foot data center at 60 Hudson Street in Lower Manhattan. CEO Peter Feldman said in an interview that not only can he generate electricity with the cleanest of the fossil fuels, but his firm has designed a cogeneration facility in which the hot exhaust gases from the generators drive a cooling system for the data center. When Datagryd has surplus power, it can sell it to New York's Con Edison utility.

As California and other states weigh whether to allow drilling for natural gas, Datagryd's example may become a pattern for future large data center operations. The ability to generate needed electricity from fuel delivered by underground pipeline proved its worth when Hurricane Sandy hit New York and New Jersey. While generators at other nearby data centers ran out of fuel and sputtered to a stop, Datagryd didn't need to get diesel trucks over closed bridges or through blocked tunnels; it kept delivering compute services to its customers throughout the crisis, Feldman said.

3. Rise Of The Port-A-Data-Center

Speaking of Hurricane Sandy, another alternative type of data center, located in Dulles, Va., bore the brunt of Sandy's impact without going down: AOL's outdoor micro data center, a module roughly the size of a Port-a-Potty. The modules are managed remotely, so if storm winds had knocked the structure over, there would have been no one onsite to set things right.

The unit sits on a concrete slab and contains a weather-tight rack of servers, storage and switching. Power is plugged into the module, the network is connected and water service is installed, since hot air off the servers is used to warm water in a heat exchanger. The water is then cooled outside the module by ambient air. It runs in a closed-loop piping system, and its temperature can rise as high as 85 degrees Fahrenheit with the cooling system still doing its job.

The design brings the power source close to the servers that will use it. The water- and fan-driven cooling system requires little energy compared with air conditioning units, and there's no need for the lights or electric locking mechanisms that a glass-house data center typically has. The combination gives the micro data center a potential PUE rating of 1.1, according to spokesmen for Elliptical Mobile, which produces the units.
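For context, PUE (power usage effectiveness) is simply the ratio of all the power a facility draws to the power that actually reaches the IT equipment:

    PUE = total facility power / IT equipment power

A PUE of 1.1 thus means the module spends only 0.1 watt on overhead such as cooling and power conversion for every watt delivered to the servers; conventional raised-floor data centers have historically reported ratios closer to 2.0.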

"We experienced no hardware issues or alerts from our network operations center, nor did we find any issues with the unit leaking," said Scot Killian, senior technology director of data center services at AOL in a report by Data Center Knowledge on Nov. 28.

AST Modular is another producer of micro data centers. Such modules may soon begin to serve as distributed units of an enterprise's data center, placed in branch offices, distributed manufacturing sites or locations serving clusters of small businesses. AOL is still in an experimental phase with its modules and hasn't said how it plans to make long-term use of them in its business.

4. DTrace Makes Data Centers More Resilient

Data centers of the future will have many more built-in self-monitoring and self-healing features. In 2013, that means DTrace, a dynamic instrumentation framework that probes how well a particular operating system and the applications running on it work together.

DTrace first appeared in Sun Microsystems' Solaris in 2005, then gradually became available in FreeBSD Unix, Apple's Mac OS X and Linux. The Joyent public cloud makes extensive use of it to guarantee performance and uptime through Joyent's SmartOS operating system, which is based on the open source Illumos (a variant of Solaris).

Developers and skilled operators can use a DTrace script to single out any running process and take a snapshot of the CPU, memory and I/O it uses, along with other characteristics. A probe fires only when the targeted process executes the instrumented event, so tracing adds no cost until that event actually occurs.
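As a rough illustration (a generic example, not AOL's or Joyent's production tooling), a complete DTrace script can be just a few lines of the D language. This one counts read() system calls per process name and prints the totals when the operator stops the trace:

    /* count_reads.d: tally read() syscalls by process name.
       Run as root with: dtrace -s count_reads.d  (Ctrl-C prints the totals) */
    syscall::read:entry
    {
        @counts[execname] = count();
    }

The same probe-and-aggregate pattern underlies the heavier CPU, memory and I/O snapshots described above; only the probe points and predicates change.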

Twitter has used DTrace to identify and eliminate a Ruby on Rails process that was slowing its systems by generating backtraces hundreds of frames long, tying up large amounts of compute power without producing beneficial results.

Jason Hoffman, CTO of Joyent, said in a recent interview that effective use of DTrace yields large amounts of data that can be analyzed to determine what goes wrong, when it goes wrong and how to counteract it. The Joyent staff is building tools to work with DTrace in this fashion and provide a more resilient cloud data center, he said.

5. New Renewable Energy Forms

The previously mentioned New York Times story panning the rapid build-out of data centers didn't consider a new possibility. The data centers of the future will not only consume less power per unit of computing done; in some cases they will also be built next to a self-renewing source of local energy, yielding net-zero carbon fuel consumption. There are many prime candidates for renewable power generation around the world, whether wind, solar, hydroelectric or geothermal, but most are too remote to become cost-effective suppliers to power grids. It's simply too expensive to build a transmission line to carry the current they can generate to the grid.

But data centers built near such sources could consume the power by bringing the data they're working with to the site, instead of bringing power to the data. Such a site would require only a few underground fiber optic cables to carry the I/O of the computing operations to customers. Facebook found Prineville, Ore., a suitable site for large data center operations; Google and other cloud service providers are believed to be building early models of data centers that rely on self-renewing energy sources. Microsoft is experimenting with a data center fueled by biogas from a wastewater treatment facility. Some enterprises may experiment with micro data centers placed near a self-renewing energy source, such as a fast-flowing stream, a sun-baked field or a windy site.

Swift streams fed by glacier melt in Greenland and by melting snows in Scandinavia have already been chosen as sites for prototype data centers at such self-renewing energy locations.
