09/28/2017 7:00 AM
Matt Stansberry, Senior Director of Content and Publications, Uptime Institute
Commentary

Top 3 Data Center Myths Debunked

Uptime Institute study provides a reality check about the state of enterprise data centers.

Each spring, Uptime Institute conducts its Annual Data Center Industry Survey of more than 1,000 data center managers and executives about their plans, challenges, and strategies for the coming year. The data from this year’s survey demonstrates that many of the tech community’s assumptions aren’t necessarily accurate. Let’s look at the data to debunk the top three data center myths.

Myth #1: Public cloud is going to kill the enterprise data center

According to the survey, 67% of respondents reported that workloads which would previously have resided in their enterprise data centers have moved to the cloud. But the data also reveals that while IT organizations are moving some workloads to the cloud, the percentage of workloads residing in enterprise-owned and -operated data centers has held steady at 65% since 2014.

Across the last four years of survey data, respondents have consistently reported that two-thirds of their IT assets run in their own sites. There is a lot of sunk investment in enterprise IT that isn’t going anywhere.

(Image: Timofeev Vladimir/Shutterstock)

That said, public cloud adoption and improved hardware performance are slowing down data center growth for some companies, and that’s not a bad thing. It’s smart use of resources and increased efficiency.

There’s a lot of uncertainty about which IT workloads are moving to the cloud and how fast, so executives aren’t approving $50 million data center projects when people are saying Amazon or Microsoft could be running everything in five years. Unless you're meeting pent-up demand in a developing economy, or working for a hyperscale company like Facebook, you probably aren’t getting $50 million next year for a new site, even if your data center is starting to show its age.

Which brings us to our next overblown concern.

Myth #2: Our legacy data center can’t handle new workloads!

What we see in the survey data is that about half of enterprise IT organizations are upgrading existing facility infrastructure, and about half of respondents report dealing with an aging chiller system or building management system (BMS).

Of course it’s going to be challenging to upgrade systems in live sites. But if this is your organization’s directive, it has accepted those risks. And you are prepared: most of you have spent your entire career in the mission-critical space, planning for outages and for things to go wrong.

If you need to replace your chiller systems, maybe it’s a good time to install that economizer you had been considering. Maybe the BMS upgrade would be a good excuse to get that data center infrastructure management (DCIM) software project started.

An operational, utilized, and depreciated facility asset is good for the balance sheet. Moreover, it’s not as if the data center you built 10 or 15 years ago can’t hold up. Uptime Institute has been studying server rack density for years, and the consistent finding is that high-density computing is not an issue for most companies. You may have pockets of high-density computing, but the vast majority of sites are operating at less than 6 kW per rack.

Myth #3: But we’ve got to keep up with what the hyperscale companies are doing!

Every data center conference for the past 10 years has featured large internet companies talking about all the very cool stuff they’re doing. Guess what? It doesn’t apply to you! Your company doesn’t need to install a data center on a barge in San Francisco Bay, integrate a wind farm, or deploy something that looks like a modern art installation.

One of the most-cited hyperscale trends of the last few years has been Facebook’s Open Compute Project, a rethinking of the integration and design of the entire IT infrastructure stack, from the data center to the processor. It’s innovative, disruptive, and incredibly cool. But only 2% of survey respondents said they’ve deployed it, and over 40% had never even heard of it.

Rather than focusing on those issues, data center execs need to pay attention to chronic, critical issues facing the profession. Focus on training and maintenance to extend the life of your site, improve your employee retention, and be more responsive to your company’s own private cloud strategy. Become more effective at tracking the cost and benefits of on-premises enterprise IT to budget holders on the business side.

Lastly, join the business conversation about off-premises compute options. Seventy percent of respondents reported that their organizations’ processes for evaluating cloud, colocation, and on-premises computing options needed improvement. In fact, 15% described their evaluation process as incoherent.

Matt Stansberry is the Uptime Institute senior director of content & publications and program director for Uptime Institute Symposium. He has researched the convergence of technology, facility management, and energy issues in the data center since 2003. Stansberry is also responsible for the annual data center survey, and develops the agenda for Uptime Institute industry events including Symposium and Charrette.
