
Predictive Analytics Key To Unlocking Big Data's Value, Reducing Cloud Complexity

While adoption of predictive analytics systems is still relatively modest, the promise they hold--giving companies a quick and logical way to make use of unstructured or big data to provide preemptive warning of failure, among other things--is prompting vendors to embed the functionality in their larger management consoles and application performance management systems.

The average enterprise IT team has lived without these systems until now, observes Jake McTigue, IT manager for Carwild and author of the InformationWeek report, Strategy: Predictive Analytics for IT. What has changed, he writes, is "the push toward private clouds and service-oriented IT, combined with an unprecedented number of data sources. A highly virtualized infrastructure, big data and analytics go together." McTigue notes that collecting operational data has generally not been IT’s strong suit, but that this needs to change "because a cloud architecture--public, private or hybrid--brings complexity, which is the enemy of uptime. And this is at the core of why we believe predictive analytics will become a must-have for enterprise IT sooner rather than later."

Besides providing preemptive warning of failure, these systems address issues around uptime, application availability and trouble-free scaling, he says. Tools such as Hewlett-Packard’s Service Health Analyzer, IBM’s SPSS-powered Tivoli product, and systems from vendors like Netuitive, CA and EMC all depend on a steady supply of big data, says McTigue. "Big, clean, fast-moving flows of real-time information are the lifeblood of predictive analytics systems."
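
To make the idea concrete, here is a minimal sketch in Python, not modeled on any of those products, of the core loop such a system runs over a metric stream: learn a per-metric baseline and raise a preemptive warning when a sample drifts well outside it. The metric, sample values and 3-sigma threshold are assumptions for illustration.

```python
# Minimal sketch (not any vendor's actual product): learn a rolling
# baseline for one metric and flag samples that drift far outside it.
class MetricBaseline:
    """Exponentially weighted mean/variance for a single metric stream."""

    def __init__(self, alpha=0.1, warmup=5):
        self.alpha, self.warmup = alpha, warmup
        self.mean, self.var, self.n = None, 0.0, 0

    def update(self, value):
        """Fold in one sample; return its z-score against the baseline."""
        self.n += 1
        if self.mean is None:
            self.mean = value
            return 0.0
        diff = value - self.mean
        self.mean += self.alpha * diff
        self.var = (1 - self.alpha) * (self.var + self.alpha * diff * diff)
        if self.n <= self.warmup:
            return 0.0  # let the baseline settle before scoring
        std = self.var ** 0.5
        return diff / std if std > 0 else 0.0

# Hypothetical samples of CPU queue depth from one virtualization host.
samples = [2, 3, 2, 3, 2, 4, 3, 9, 14, 22]
baseline = MetricBaseline()
for t, value in enumerate(samples):
    z = baseline.update(value)
    if z > 3.0:  # well outside normal: warn before hard saturation
        print(f"t={t}: value {value} is {z:.1f} sigma above baseline -- warning")
```

On this toy stream the warning fires at the first large jump (t=7), before the queue fully saturates, which is the "preemptive" behavior the commercial systems aim for at much larger scale.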

Previously, predictive analytics systems for IT infrastructure were available only as SDKs, he says. Now, vendors are designing, building and marketing predictive analytics systems specifically for IT infrastructure use, "which takes a huge bite out of the time to deployment, development effort and cost," McTigue says. "The new canned systems start running pretty much out of the box, which is why they’re suddenly a much better value proposition than developing your own system used to be."

Predictive analytics is still a new concept for IT, though, and McTigue says awareness is only slowly catching up. The systems are also costly and require "good metric streams from all levels of the infrastructure," meaning only large organizations are ready for immediate adoption, and, as with any new technology, there is a learning curve. "The primary barrier to adoption is having metric streams that you can trust available to the system from the entire infrastructure," he says, along with integrating the system into day-to-day business processes so that IT is actually acting on the predictive capabilities.
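
As a hedged illustration of what a "metric stream you can trust" involves, the sketch below runs basic sanity checks (out-of-order timestamps, collection gaps, implausible values) before samples reach an analytics system. The field layout, limits and messages are assumptions, not any product's API.

```python
# Illustrative sketch: sanity checks a metric stream might need to pass
# before an analytics system should trust it.
def validate_stream(samples, max_gap_s=60, value_range=(0.0, 100.0)):
    """Return a list of human-readable problems found in a metric stream.

    samples: iterable of (unix_timestamp, value) tuples, e.g. CPU %.
    """
    problems = []
    last_ts = None
    lo, hi = value_range
    for ts, value in samples:
        if last_ts is not None:
            if ts <= last_ts:
                problems.append(f"non-monotonic timestamp at {ts}")
            elif ts - last_ts > max_gap_s:
                problems.append(f"{ts - last_ts}s gap before {ts}")
        if not (lo <= value <= hi):
            problems.append(f"out-of-range value {value} at {ts}")
        last_ts = ts
    return problems

stream = [(0, 41.0), (30, 43.5), (180, 44.0), (181, 250.0)]
for problem in validate_stream(stream):
    print("reject:", problem)
# -> reject: 150s gap before 180
# -> reject: out-of-range value 250.0 at 181
```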

McTigue points out that in an InformationWeek Virtualization Management survey, respondents were asked to rate a dozen features by importance, and high availability was the No. 1 criterion. Predictive analytics can deliver that availability by helping overcome complexity, he writes. But McTigue also warns that organizations need to get a handle on this soon, since data volumes are climbing exponentially and complexity "will only increase as we adopt ever more advanced converged architectures."

Automation is the ultimate goal of predictive analytics systems, since "all the predictive data in the world does us no good if we can’t act on it, fast." Yet McTigue says the survey also found that fewer than half of respondents (40%) characterize their use of data center automation tools as significant or extensive. At the same time, 27% report network problems every day or week, and 32% say they have server problems just as often. He cites trust issues as another barrier to adoption.
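
A small sketch of what acting on predictions can look like, assuming hypothetical orchestration hooks (migrate_vms_off, page_oncall) rather than any real product's interface: predicted events map to a playbook in which only low-risk, high-confidence responses run automatically, and everything else pages a human, one way to work within the trust barrier McTigue describes.

```python
# Hedged sketch: wiring predictions to automated responses. The hooks
# (migrate_vms_off, page_oncall) are hypothetical stand-ins for whatever
# an orchestration layer actually exposes.
def page_oncall(host, reason):
    print(f"paging on-call: {host}: {reason}")

def migrate_vms_off(host):
    print(f"draining and live-migrating VMs off {host}")

# Low-risk actions may run automatically; riskier ones stay manual
# until the predictions have earned trust.
PLAYBOOK = {
    "disk_failure_predicted": (migrate_vms_off, "auto"),
    "memory_leak_predicted": (page_oncall, "manual"),
}

def act_on_prediction(event, host, confidence, auto_threshold=0.9):
    action, mode = PLAYBOOK[event]
    if mode == "auto" and confidence >= auto_threshold:
        action(host)  # act on it, fast
    else:
        page_oncall(host, f"{event} (confidence {confidence:.0%})")

act_on_prediction("disk_failure_predicted", "esx-host-12", 0.95)
act_on_prediction("memory_leak_predicted", "esx-host-07", 0.85)
```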

The technology is in its infancy, he says, but the level of development in the last year alone proves that the technology is here to stay. And that makes sense, says McTigue, because as companies continue to launch more data-intensive, Internet-only businesses and business operations, "they’ll need to monitor IT closely since it’s a huge ingredient in both profitability and viability."

