Kurt Marko

Contributing Editor



Data Center Efficiency Plateaus

The latest Uptime Institute Data Center Industry Survey reveals some interesting trends, including a reduced focus on data center efficiency.

According to Uptime's survey, which queried 1,000 data center facility operators, IT managers and senior executives from around the globe, efficiency, as measured by median responses (so we're not talking about behemoths like Amazon, Google or Facebook), has plateaued and is no longer considered an urgent priority. Only half of North American respondents said they consider efficiency to be very important.


Uptime's data shows that PUE, the standard metric for data center efficiency, improved dramatically from 2007 to 2011, but those initial gains were largely the result of easy fixes like properly isolating hot and cold aisles, installing blanking panels in unused rack segments and upgrading old power distribution equipment to more efficient models. Now, however, improvements are much harder and more costly to come by. Thus, most operators consider a 1.65 PUE (the average in this year's survey) good enough, even as the mega colocation centers and cloud operators race to see who can edge closer to the ideal of 1.0.
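For context, PUE (power usage effectiveness) is simply total facility energy divided by the energy consumed by IT equipment, so 1.0 would mean every watt goes to compute. Here's a minimal sketch of the arithmetic; the megawatt figures are illustrative, not from the survey:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# Illustrative numbers: a 1 MW IT load in a facility drawing 1.65 MW
# total yields the survey-average PUE of 1.65; the theoretical ideal is 1.0,
# where nothing is spent on cooling, power conversion or other overhead.
print(pue(1650.0, 1000.0))  # 1.65
```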

An easy fix can be borrowed from every homeowner trying to cut their summer electric bill: just crank up the thermostat. Only 7% of respondents operate data centers at temperatures above 75 degrees, even though ASHRAE, the professional society of HVAC engineers, says 80 degrees is a reasonable upper bound.
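As a rough illustration of how that guidance might be applied, the sketch below flags server inlet readings against an assumed recommended envelope of 64.4 to 80.6 degrees F (18 to 27 C, per ASHRAE's 2011 thermal guidelines); the sensor names and readings are hypothetical:

```python
# Sketch: flag inlet-temperature readings against ASHRAE's recommended
# envelope, assumed here as 64.4-80.6 F (2011 thermal guidelines).
ASHRAE_RECOMMENDED_F = (64.4, 80.6)  # assumption, not a figure from the survey

def check_inlet_temps(readings_f: dict[str, float]) -> None:
    low, high = ASHRAE_RECOMMENDED_F
    for sensor, temp in readings_f.items():
        if temp < low:
            print(f"{sensor}: {temp} F -- overcooled, room to raise the setpoint")
        elif temp > high:
            print(f"{sensor}: {temp} F -- above the recommended envelope")
        else:
            print(f"{sensor}: {temp} F -- within the recommended range")

# Like most of the survey's respondents, this hypothetical room runs cold.
check_inlet_temps({"rack-a1": 68.0, "rack-b4": 72.5, "rack-c2": 81.2})
```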

Another drag on efficiency is the prevalence of zombie servers, the survey indicated. "According to Uptime Institute’s estimates based on industry experience, around 20% of servers in data centers today are obsolete, outdated or unused," the report said. Uptime estimates that for every 1U zombie unplugged, operators save about $2,500 a year in energy, OS licenses and hardware maintenance.
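That $2,500 figure bundles energy, licensing and maintenance. The back-of-the-envelope sketch below shows how such a number might break down; every input (idle wattage, utility rate, license and maintenance costs) is an assumption for illustration, not a number from Uptime's report:

```python
# Back-of-the-envelope annual cost of one idle 1U server. Every input
# below is a hypothetical assumption for illustration, not Uptime data.
IDLE_WATTS = 250          # plausible idle draw for an older 1U box
PUE = 1.65                # survey-average facility overhead multiplier
RATE_PER_KWH = 0.10       # assumed utility rate, USD
OS_LICENSE = 800          # assumed annual OS/support licensing
HW_MAINTENANCE = 1000     # assumed annual hardware maintenance contract

HOURS_PER_YEAR = 24 * 365
energy_kwh = IDLE_WATTS / 1000 * HOURS_PER_YEAR * PUE
energy_cost = energy_kwh * RATE_PER_KWH

total = energy_cost + OS_LICENSE + HW_MAINTENANCE
print(f"energy: ${energy_cost:,.0f}/yr, total: ${total:,.0f}/yr")
# With these assumptions, energy alone is only ~$360/yr; licenses and
# maintenance do most of the work in reaching Uptime's ~$2,500 figure.
```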

[Uptime's study also indicated that data centers are becoming the domain of service providers as smaller enterprises increasingly outsource their data center operations. Read Kurt Marko's analysis in "Data Center Study: The Big Get Bigger."]

In addition to data center efficiency trends, the Uptime report highlights three data center technologies that are poised for explosive growth: adoption of public cloud services, data center infrastructure management (DCIM) and prefab modular data centers. While we'd agree on the first two, we have our doubts about modulars.

Public cloud growth is a no-brainer, as nearly every survey, including ours, shows that enterprise resistance -- fueled by a combination of protectionism, security and performance FUD, and immature management software -- is rapidly crumbling. Only 20% of the respondents to InformationWeek's State of Cloud Computing Survey have no plans to use a cloud service provider. Uptime finds global cloud adoption still rather low at 28%, but large companies are twice as likely as smaller ones (as defined by the total number of servers operated) to deploy public cloud services.

In contrast, private cloud seems to have hit a brick wall, with deployment actually falling in Uptime's survey. Either it's harder than people think, or smaller companies figure: why bother re-architecting for a private cloud when they can rent a ready-made one from AWS or Rackspace?

According to the survey, 38% of respondents use DCIM software, which Uptime defines as a facility-wide system that catalogs assets, collects usage statistics and records operational status. Using some homegrown spreadsheets and open source monitoring tools doesn't qualify, although we would argue that something like Nagios is a long way from DIY Perl scripting and includes many DCIM features. Uptime's number seems high, but the respondent demographics skew large, with 82% managing more than one site and 42% in the business of data center hosting as a colocation or cloud service provider.
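To make that definition concrete, here's a minimal sketch of the kind of record a DCIM system might keep for each asset -- identity, location, usage statistics and operational status. The field names are illustrative, not taken from any particular product:

```python
from dataclasses import dataclass, field

# Minimal sketch of a DCIM-style asset record, per Uptime's definition:
# catalog the asset, collect usage statistics, record operational status.
@dataclass
class Asset:
    asset_id: str
    model: str
    rack: str                    # location, e.g. "row 3, rack 12"
    rack_unit: int               # position within the rack
    status: str = "in_service"   # or "zombie", "decommissioned", ...
    power_watts: list[float] = field(default_factory=list)  # usage samples

    def avg_power(self) -> float:
        return sum(self.power_watts) / len(self.power_watts) if self.power_watts else 0.0

inventory = [Asset("srv-0042", "1U web server", "row 3, rack 12", 17,
                   power_watts=[251.0, 248.5, 255.2])]
print(inventory[0].avg_power())  # a simple usage-statistics query
```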

Of course, only large operators can justify the cost of DCIM tools: even among the small companies in Uptime's sample, 72% report spending over $100,000 on DCIM, while 17% of the largest spend $400,000 or more.

I take issue with Uptime's prediction regarding prefab modular data centers. Those semi-truck shipping containers made into tightly packed computer rooms are a clever idea whose time has come and gone. When first introduced more than five years ago, modulars offered superior energy and space efficiency compared to conventional facilities, but with significant downsides. First, you needed to redesign data center facilities to look more like a mobile home park -- with concrete pads and utility drops -- than a self-contained warehouse. Second, with such tight quarters, if -- or make that when -- a modular's cooling system so much as hiccups, the temperature spike can roast everything inside within minutes.

According to Uptime's own data, modular adoption is tepid: only 8% of data center operators have deployed them and another 8% are considering them. The majority of respondents (53%) have no interest. Even among large operators, only 15% have modular deployments.

Kurt Marko is an IT pro with broad experience, from chip design to IT systems.

