Utility Computing: Have You Got Religion?

At its most hyped, utility computing is a simple idea. But is it really "the way"?

November 30, 2003


At its most hyped, utility computing is a simple idea: enterprise IT services as reliable as electricity. Just plug in, use what you need, then pay the bill at the end of the month. It's an especially welcome notion to corporations that have spent mercilessly on IT only to find themselves wondering what, exactly, they got for their money. After years of a flat economy, fragile security and questions about whether IT ever really delivers the benefits it promises, many executives are ready for an IT epiphany.

But is utility computing the way? That's hard to say when the industry cannot even agree on the term's definition. Although some major outsourcing deals have been positioned as utility computing in action--IBM's takeover of much of American Express' IT department is one example--this is the least likely form utility computing will take for most enterprises. Rather, most businesses will find strategic value in retaining ownership of their IT resources. For them, utility computing means rethinking how IT gets its job done: if IT needs to become more responsive and cost-effective, it will do so by managing its systems holistically and designing them from the ground up to be easier to run and fundamentally more flexible.
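To make the metering half of that idea concrete, here is a minimal sketch in Python of how a utility-style monthly bill might be computed from usage records. The resource names and per-unit rates are hypothetical--they are not drawn from any vendor's actual pricing.

```python
from collections import defaultdict

# Hypothetical per-unit rates; a real utility-computing contract would
# define its own resources, units and prices.
RATES = {
    "cpu_hours": 0.12,         # dollars per CPU-hour consumed
    "storage_gb_days": 0.002,  # dollars per gigabyte stored per day
    "bandwidth_gb": 0.05,      # dollars per gigabyte transferred
}

def monthly_bill(usage_records):
    """Sum metered usage per resource and price it like a utility.

    usage_records: iterable of (resource, amount) tuples collected
    over the billing period.
    """
    totals = defaultdict(float)
    for resource, amount in usage_records:
        totals[resource] += amount
    return {res: round(qty * RATES[res], 2) for res, qty in totals.items()}

# Pay only for what was actually consumed this month.
records = [("cpu_hours", 1500), ("storage_gb_days", 90000), ("bandwidth_gb", 400)]
print(monthly_bill(records))
# {'cpu_hours': 180.0, 'storage_gb_days': 180.0, 'bandwidth_gb': 20.0}
```

The point of the sketch is the billing model, not the numbers: consumption is metered continuously, but the customer sees only a usage-based statement at the end of the month.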

Definitions aside, the vendor and analyst communities are on board. Hewlett-Packard sees utility computing as a $3 billion market by 2004, growing to $18 billion by 2006, with professional services accounting for 30 percent to 35 percent of that total. IBM has earmarked $10 billion in research and development, acquisition, and marketing funds for its utility-computing initiative. On the analyst side, IDC says that 35 percent of all servers sold in the United States this year will be blades, with the market swelling to $6 billion by 2007.

Those rosy numbers would seem to portend a revolution, so we set out to determine just where the market is headed, hoping to separate reality from hype. What we found is that utility computing holds great promise and in fact represents the maturing of IT. Just as sales, manufacturing, R&D and other corporate disciplines have grown up, now it's IT's turn. It's not a matter of whether IT will change, but when.

Evolution vs. The Big Bang

Before we drill down into definitions, viewpoints and timetables, it's worth noting that we are skeptical as to whether a revolution is coming, and judging from our reader poll, so are you. Vendors have heralded more than one overnight transformation that never came to fruition. Why? Because Darwin had the right idea.

Our industry, like most, evolves, and with good reason: You, the architects and ultimate buyers of information technology, won't let things move at a breakneck pace. It's far too risky. With the possible exception of the dot-com revolution--an experience we'd rather not repeat--IT decision-makers have dictated that real, meaningful, business-affecting change happens over the span of a decade, more or less. First, there was the PC revolution, from the mid-'80s to the mid-'90s; then the networking revolution of the early '90s; then the client-server revolution of the last 10 years; and most recently the Internet revolution. All were important turning points for IT, but each took the better part of a decade to move from first implementations to widespread business reliance.

It was this evolution of information technologies that led to the dawn of utility computing. Until three years ago, the executive suite's conviction that IT spending was critical to business success led to a sometimes reckless disregard for actual cost--so much so that IT still accounts for more than half of the typical corporate capital budget, not to mention a sizable human-resource cost.

In the last three years, a languishing economy and flat revenue growth have made management downright zealous about scrutinizing expenses. And when a big cheese has asked, "How much bang did I get for my IT bucks?" the answers have been less than satisfying.

Part of that dissatisfaction is simply a matter of nomenclature; IT managers still talk about uptime, transactions per second and initiatives taken on. The executive response has been: "Just tell me how I saved money or gained new business by doing all this IT stuff." Pity the poor CIO with no hard numbers to offer.

The watchword is: "Prove your value and lower your costs." Corporate profitability depends on it. In fact, our poll found that a whopping 57 percent of senior managers say that IT costs too much. Some 35 percent added that IT is too slow to respond to business needs, and 31 percent said IT doesn't measure its performance with usable business metrics. Remember that we polled IT decision-makers, not senior business-line executives--it's a fair bet that a survey of non-IT types would show even higher dissatisfaction.

This, then, is the utility-computing mandate: Lower costs, make IT more responsive, and make it more accountable in its use of funds.

To take the pulse of the utility-computing trend, we spoke with a cross section of vendors about their definitions of, and plans for, utility computing. EMC, HP, IBM, Microsoft and Veritas Software claimed to embrace all three goals, albeit with varied emphasis and approaches. As you might expect, vendors as a whole are much more enthusiastic about helping to create a more responsive and accountable IT department than they are about reducing costs. When vendors did talk about reducing costs, they almost universally emphasized reducing human costs or using existing resources more effectively.

This makes sense. No vendor is going to develop a product strategy with the intent of shrinking its market share year over year. None of the five was even willing to confess that what it would lose in margin it would make up in volume. As far as vendors are concerned, if they are going to save you money, it's going to be for stuff they don't sell--read: human resources.

But this will be an uphill battle: Changing the human-resource equation means changing what IT does, and that means fighting inertia, which alone will quell any mass conversion. Universally, vendors cite the need to eliminate time-consuming, repetitive IT tasks. They understand that systems need to be more self-healing and that management tools, particularly those with a broad scope, need to become far more effective. Vendors also realize that many changes will be organizational; and to the extent you'll let them, they want to help make those changes, too.

In fact, if you get nothing else from this article, realize that the notion of utility computing is largely about changing the organizational dynamics of IT. Creating a more responsive organization means changing the organization, and that's hard. IT grew up the way it did for a reason. The disciplines within IT require training, experience and dedication; therefore, a relatively narrow skill set is highly valued, while big-picture thinking is usually dismissed or reserved for a very few "architects." Divide and conquer has been the way to IT success.

That means networking administrators rarely talk to database gurus, who rarely talk to desktop support people, who rarely talk to storage administrators. When new initiatives, such as a sales-force or manufacturing application, come up that will make special demands on the network, database, desktops and storage, task forces usually are created to figure out what IT needs to do.

The networking representative comes up with a networking plan, the database admin figures what's needed on her end, and eventually a strategy is formed and the new application is fielded. Meanwhile, the representatives on this task force still have full-time jobs to do, so they figure out how to manage their time and meet both commitments.

This is the process that has spawned the unresponsive IT departments we hear so much grousing about today. And vendors have compounded the problem by giving various fiefdoms exactly what they've asked for--point tools that let IT groups effectively manage their unique resources without concern for, or knowledge of, the big picture.

Of course, on a day-to-day basis, these point-management tools are the ones that actually work and let IT experts do their jobs. However, with all the people inside IT using different tools, and different performance metrics to measure success, the likelihood of rolling it all up into a responsive organization that can measure performance based on business metrics is about equal to the proverbial snowball's chance in hell. The track records of broad-perspective management tools have been notoriously bad.

Note to vendors: If you really want to deliver on utility computing, create some truly useful broad-based management tools.

There are essentially three approaches to this problem, and the three large players we spoke to for this article epitomize them.

The first line of thought is that things really have changed. System software, servers, storage and networking, along with management software, have matured to the point that a holistic, high-level management approach is possible. This is the view taken by HP, the company that brought us such concepts as pay-as-you-grow purchasing for server hardware and spiffy new data center management tools like WSMF (Web Services Management Framework) and VSE (Virtual Server Environment).

Unfortunately, HP talks a better game than it plays. Its OpenView management platform has been around forever and is too large, too costly and too ineffective. To HP's credit, it knows this (even if it doesn't acknowledge it) and is working hard to create better management tools.

The second approach agrees that things have indeed changed, but given that legacy systems and multivendor environments abound, a reorganization of the way IT works, along with new hardware and software, is needed. This "it takes a village" outlook is espoused by IBM, which touts extensive services offerings (under the IBM Global Services group) along with enhancements to its hardware and Tivoli management software. Tagline: a solution that goes beyond reshaping IT to reshaping corporate strategy.

IBM seems to have the most realistic approach; however, it also has the highest price tag. Remember, we started this pilgrimage with the goal of reducing costs, not spiking them.

The third view says that if you really want to solve this problem, you must do it at a fundamental level. Applications, client software, operating systems, management software and systems hardware all need to be fully instrumented so they can provide information on their needs rather than forcing management tools and expert staff to take educated guesses.

As you've probably guessed, this is Microsoft's approach, which it now calls its Dynamic Systems Initiative. Under DSI, Microsoft has announced a number of across-the-board initiatives. The company is also signing up partners to participate in its vision. If you trust Microsoft and don't mind waiting the better part of five years, this story is a good one. The problem is, we've learned the hard way not to trust Microsoft to deliver on such sweeping initiatives.
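As a rough illustration of what "fully instrumented" could mean in practice, here is a minimal sketch in Python. The interface is entirely hypothetical--our own invention, not anything Microsoft has published for DSI--but it shows the shift: each component declares its resource needs and health so a management layer can read them instead of guessing.

```python
from dataclasses import dataclass, field

@dataclass
class ResourceNeeds:
    """Requirements a component declares to management tools."""
    min_cpu_cores: int
    min_memory_mb: int
    depends_on: list = field(default_factory=list)

class InstrumentedService:
    """A component that describes itself rather than being guessed at.

    The method names are hypothetical; they stand in for whatever
    contract a DSI-style initiative would eventually standardize.
    """
    def __init__(self, name, needs):
        self.name = name
        self.needs = needs
        self.healthy = True

    def describe(self):
        """Report requirements so a manager can place and size the service."""
        return {"name": self.name,
                "cpu_cores": self.needs.min_cpu_cores,
                "memory_mb": self.needs.min_memory_mb,
                "depends_on": self.needs.depends_on}

    def health(self):
        """Report liveness so problems surface without an expert's guesswork."""
        return {"name": self.name, "healthy": self.healthy}

# A management layer reads declared needs instead of taking educated guesses.
svc = InstrumentedService("order-db", ResourceNeeds(4, 8192, ["storage-pool-1"]))
print(svc.describe())
print(svc.health())
```

Multiply that by every application, driver and device in the data center and you get a sense of both the appeal of the approach and the scale of the undertaking.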

And so here we are. No vendor has a perfect story, but each has something on paper: You don't get to be an HP, IBM or Microsoft without always having a plan.

Yet none of these visions is even close to that ideal of a utility service that's as simple to use and budget for as your electric service.

That sort of utility computing is at least the better part of a decade away--if it ever happens. But we understand that many of you want to start on the road, so in the next few pages we delve deeper into the devilish details.

Art Wittmann is a Network Computing contributing editor. He was previously the editor of Network Computing, and has also worked at the University of Wisconsin Computer-Aided Engineering Center as associate director. Send your comments on this article to him at [email protected].

If all the vendor noise about utility computing sounds to you like a bunch of traveling televangelists promising to heal your budget woes, place end users under your sway and raise the dead to boot, read on.

Utility computing is the industry's answer to CFO Lament No. 1: "IT costs too much, is unresponsive and is impossible to measure by the usual business metrics."

What utility computing definitely is not now--nor will it ever be--is a real utility service. Information technology can't be purchased like water. Period. Rather, utility computing represents the evolution of the IT industry.

In the next few months and years, look for vendors to deliver systems and software that are more flexible, more self-healing and more manageable. Servers and storage systems will continue on their trek toward modularity. Management software will do a better job of showing how resources are being used and will off-load some of the tedious, error-prone, time-consuming tasks that currently weigh down IT staffs.

Unfortunately, the more heterogeneous your IT infrastructure, the harder these goals will be to achieve. On the systems side, Hewlett-Packard and IBM are showing the best understanding of this challenge. Software companies like Veritas Software also have a good story to tell. Vendors whose value proposition is tightly tying proprietary hardware to proprietary software will continue to be less attractive.
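As for showing how resources are used, here is a minimal sketch in Python of the kind of rollup such management software might produce: raw per-server utilization translated into a per-business-unit view. The department mapping and the numbers are hypothetical.

```python
# Map servers to the business units that own them (hypothetical data).
SERVER_OWNERS = {"web01": "Sales", "web02": "Sales", "db01": "Manufacturing"}

# (server, cpu_percent) readings from one polling cycle.
samples = [("web01", 72.0), ("web02", 18.0), ("db01", 55.0)]

def rollup_by_department(samples, owners):
    """Average CPU utilization per department--a number a CFO can argue about."""
    sums, counts = {}, {}
    for server, cpu in samples:
        dept = owners[server]
        sums[dept] = sums.get(dept, 0.0) + cpu
        counts[dept] = counts.get(dept, 0) + 1
    return {dept: round(sums[dept] / counts[dept], 1) for dept in sums}

print(rollup_by_department(samples, SERVER_OWNERS))
# {'Sales': 45.0, 'Manufacturing': 55.0}
```

Trivial as it looks, this is exactly the translation--from device metrics to business metrics--that the point tools of today's fiefdoms never make.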

In the short run, efforts to reshape how IT resources are purchased should be viewed skeptically. Vendors are unwilling to shave more off bottom lines that have already been cut thin. And vendors can deliver only a piece of the puzzle. To really make IT more efficient and responsive, the IT organization must change. The tendency to prize unique skills must take a backseat to appreciating the big picture. Just as product design reinvented itself in the '80s, so must IT start producing systems and resources that are flexible, accountable and affordable. The idea isn't to turn your IT shop into a utility, but to get more utility out of what you're doing.

This process will take time and may, in the short run, cost more money. The trick is to keep one end goal constantly in mind--to organizationally evolve IT to be flexible, accountable and affordable.

When it comes to vendors' grand visions, a healthy dose of agnosticism is your best bet. Weeding through claims is complicated, and the equation changes drastically depending on the size and focus of your enterprise. Microsoft does well for smaller companies, and HP, IBM and Sun Microsystems do likewise for the big players--yet none is even close to having all the pieces of the utility-computing puzzle. Realize that today, adopting a strategy wholesale is a leap of faith.
