12/04/2013 1:22 PM

To Crack The Cloud Lock-In Problem, Think Standards

Emerging specs for hybrid clouds and converged datacenters promise to break vendors' proprietary hold.
Download the entire December 2013 InformationWeek special issue on hybrid cloud standards, distributed in an all-digital format (registration required).

The word "standards" evokes images of combative committees taking six months to decide where to hold a meeting and then letting dominant industry players hold down superior technology.

Vendors blamed delays in ratifying the 802.11n spec for going rogue with proprietary implementations that often didn't work together; eventually, the Wi-Fi Alliance had to spend money and effort on a certification program. And while the Open Networking Foundation begs to differ, Cisco CEO John Chambers recently asserted that advanced networking "can't be done in software." His unstated pitch: "Why wait for messy SDN standards to gel? Just cut a check for Cisco ONE."

Going with proprietary technology is tempting. But standards remain important, especially in the era of convergence and cloud, with its "just make it work" culture. In particular, standards are critical for shuttling workloads between on-premises and multitenant systems. A well-considered design that uses stock components, protocols, and interfaces wherever possible will improve efficiency, lower costs, and maximize flexibility and scalability.

Crack the code
In some ways, the cloud is becoming a standard in its own right. Accenture forecasts that cloud services will grow at seven times the rate of in-house IT between now and 2016, at which time 46% of all IT spending will be cloud-related. A KPMG survey finds that for 14 major enterprise functions, from email and office productivity to HR and supply chain, 60% to 90% of respondents will be using cloud services within 18 months.

While the public cloud garners the most attention, enterprises will deliver most IT services from private and hybrid clouds for the foreseeable future. In our 2012 InformationWeek Private Cloud Survey, 21% of respondents had built private clouds, with an additional 30% starting projects. Our 2014 survey, which will debut later this month, shows 47% of respondents with private clouds in production for some or most of their applications and 30% testing or starting private cloud projects. Most of those shops want to integrate private and public cloud services in a hybrid architecture -- in fact, just 19% of those using or planning to use the public cloud aren't going to supplement that usage with a private cloud setup.

Our take: Most enterprise IT teams will be tasked with integrating cloud services with on-premises infrastructure and applications. Sounds to us like a call for standardization.

Public-private chasm
Standards must be in play both within the datacenter and at the interface between private and public services. For a converged private cloud architecture, borrow from the service provider playbook and use a common set of server and storage components on a converged Ethernet backbone. That's the only way to lower costs and improve versatility, by letting workloads migrate (often automatically) from system to system without worrying about configuration or hardware compatibility.

The notion of using standardized building blocks that can be quickly deployed is the basis of the "Superpods" that Salesforce is working on with Hewlett-Packard. Essentially, if a customer doesn't want to share application infrastructure, Salesforce can plop down a standard set of hardware, spin up its software stack, and deliver a service identical to its public-cloud software-as-a-service suite.

As for the interfaces between public and private clouds, standards come into play for workload orchestration, application image packaging, infrastructure management, and user authentication. Rather than leaving workloads and resources stranded in separate bubbles, standard interfaces tied to corporate identities let IT shuttle applications between public and private clouds and manage everything from a single console.
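To make the packaging idea concrete, here is a minimal sketch of a cloud-neutral workload descriptor in the spirit of packaging standards such as OVF. All field names and the helper function are illustrative assumptions, not drawn from any actual specification:

```python
import json

# Hypothetical, simplified workload descriptor: just enough metadata for
# either a private or a public cloud to deploy the same application image.
# Field names are illustrative, not from any real standard.
def make_descriptor(name, image, cpus, memory_gb, identity_provider):
    return {
        "name": name,
        "image": image,  # cloud-neutral image reference
        "resources": {"cpus": cpus, "memoryGB": memory_gb},
        # Corporate identity source, so the same credentials work on
        # both sides of the hybrid boundary.
        "auth": {"idp": identity_provider},
    }

descriptor = make_descriptor("payroll-app", "payroll:v12", 4, 16,
                             "https://idp.example.com/saml")
print(json.dumps(descriptor, indent=2))
```

The point of a descriptor like this is that an orchestrator on either side of the hybrid boundary can act on it without caring which vendor's hardware sits underneath.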

All of this should be possible without locking your company into a proprietary stack, but be careful. So far, cloud technology advances have outpaced the industry's ability to craft comprehensive standards for interoperability, management, auditing, and data migration.

To read the rest of this story, download the InformationWeek December 2013 special issue on hybrid cloud standards.


Why should vendors play along?

It's clear why IT wants standards, and why hardware makers and established vendors like Microsoft tend to play (at least somewhat) nicely in the standards sandbox, even if they'd prefer not to.

But why should Amazon or Google want to make it possible for customers to easily and quickly shuttle workloads in and out of their clouds? Playing devil's advocate, that ability will put downward pressure on prices, like a cable customer who jumps between Verizon and Comcast every year or two to get new-customer deals, except at hyperspeed.

Why not be upfront and pursue lock-in, if you're big enough to get away with it?

Re: Why should vendors play along?

The problem is that they're still not big enough to get away with it. Sure, Amazon dominates the cloud market, but it's not an oligopoly. There are dozens of large alternatives and hundreds of smaller ones. Verizon, HP, Rackspace, CenturyLink, et al. are more than happy to siphon off AWS and GCE customers by being more customer friendly.


Vendors don't play along, until they have to.

Why should vendors play along? They don't, voluntarily. Dominant vendors don't adopt a standard as long as they have any hope of setting a de facto standard and getting the world to adopt their way of doing things. Amazon has almost done so, but let's not encourage more bad behavior.

Joe Weinman wrote knowledgeably in this space about the Internet of Clouds, as well he might. He's the chair of the IEEE's cloud interoperability standards body.

Re: Vendors don't play along, until they have to.

Yes, short-sighted vendors won't play along, but experienced vendors know that lock-in via their own de facto standard usually doesn't pay off in the long run. If the technology in question is significant, eventually their competitors will get together and standardize an alternative. If this alternative succeeds, their investment in their proprietary solution becomes a liability.

Also, standards help to create markets: IBM made more money selling IP gear than it ever made selling SNA equipment, even though the latter was completely proprietary to IBM. To the extent that the lack of interoperability and portability is holding back the cloud computing market, vendors might be better off abandoning their attempts to lock in their customers in exchange for a smaller share of a much larger pie.

Standards bodies: irrelevant?

I agree with everyone who's pointing out that vendors aren't going to play along, and I would add a question about the relevance of "standards bodies" today, in an age where technology is moving so much faster than standards bodies can.  Can anyone actually name a standard created in the past five years that originated with a standards body?  When I think of cloud standards, I think of things like OAuth (Twitter) and SCIM (Google/Salesforce)... and I look at standards like CAMP and think: ain't going to happen.
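For what it's worth, the grassroots specs named above are concrete enough to show in a few lines. Below is a simplified SCIM-style user-provisioning payload; the attribute names follow the SCIM core user schema, but treat the whole thing (including the schema URN shown) as a sketch rather than a complete conforming request:

```python
import json

# Simplified SCIM-style user resource. Attribute names (userName, name,
# emails) follow the SCIM core schema; this is a sketch, not a complete
# conforming payload.
user = {
    "schemas": ["urn:scim:schemas:core:1.0"],
    "userName": "jdoe@example.com",
    "name": {"givenName": "Jane", "familyName": "Doe"},
    "emails": [{"value": "jdoe@example.com", "primary": True}],
}

# An identity system would POST this JSON body to the provider's
# /Users endpoint to provision the account.
body = json.dumps(user)
print(body)
```

The appeal of this style of spec is exactly its modesty: a small JSON schema plus plain HTTP, which is why vendor-driven efforts like SCIM spread faster than committee-first standards.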

Re: Standards bodies: irrelevant?

Yes, technology moves faster than standards bodies, but no one who knows what they're doing attempts to standardize an area that is changing rapidly. CAMP, which you mentioned, is an attempt to standardize the management of applications on a PaaS platform. There haven't been any significant technological changes in this area for at least five years now -- which is precisely why Rackspace, Red Hat, Oracle, et al. decided to work on this particular problem.

As for speed, I would be surprised if CAMP weren't finished by early 2014 (February or March).