David Hill

Tackling Information Infrastructure Complexity

Taking a software and data-driven approach can help reduce costs and maintain availability as IT becomes more complex. Here's a look at Sanbolic, one of the companies in this emerging area.

Sanbolic’s Melio is useful internally for IT organizations, but it can also be used in the public cloud. It's available as an Amazon Machine Image (AMI) via the Amazon Web Services Marketplace. Melio allows customers to cluster and scale out their websites with high availability, a feature that has attracted developers. For example, Roadnet Technologies offers a trucking fleet management product as SaaS, with Melio as its foundation.

On the storage side, Melio aggregates storage across Amazon EC2 instances while providing storage and data management services such as RAID, remote replication and snapshots. On the server side, Melio’s AppCluster component provides clustering for Microsoft IIS servers running on Amazon EC2. Spanning clusters across multiple AWS groups enables high availability, and dynamically expanding clusters to handle increased traffic eliminates the need to replicate data. The rule of thumb is to keep data as close as possible to the application that needs it, and it is usually simpler to move applications than data.
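The high-availability pattern described above, spreading cluster nodes across separate AWS failure domains, can be sketched as a simple round-robin placement routine. This is purely illustrative (the node and zone names are hypothetical, and Melio's actual placement logic is not public):

```python
from itertools import cycle

def place_nodes(node_names, zones):
    """Round-robin cluster nodes across zones so that losing any one
    zone takes down only a fraction of the cluster."""
    zone_cycle = cycle(zones)
    return {node: next(zone_cycle) for node in node_names}

# Hypothetical four-node IIS cluster spread over two zones.
placement = place_nodes(
    ["iis-1", "iis-2", "iis-3", "iis-4"],
    ["us-east-1a", "us-east-1b"],
)
```

With this spread, each zone hosts two of the four IIS nodes, so a single-zone outage leaves half the cluster serving traffic.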

Note that, in general, whatever Melio does for the public cloud it can also do for private clouds.

Not A Quick Fix

Theoretically, whoever controls the type of data-driven software intelligence that Sanbolic offers controls the information infrastructure and commoditizes the underlying hardware. No wonder Sanbolic and other relatively young software-based competitors have drawn the attention, and the competitive response, of major vendors, such as EMC with its ViPR offering.

Note also that the involvement of large vendors is a “Good Housekeeping Seal of Approval” that validates the market, which means more and more potential customers will start evaluating software-defined solutions. While that benefits Sanbolic and other software-only vendors, it's not likely to cost the large IT vendors much sleep.

While IT organizations move quickly in some areas, adopting this kind of technology wholesale would be a major change. Switching costs (not only purchase costs, but also sunk costs), risk management, and planning and implementation are all measured in years, not months. Still, if software-only companies like Sanbolic play their cards right, they could do relatively well for their size.

Mesabi Musings

Something has to be done about the growing complexity of IT information infrastructures. Traditional strategies are unsustainable, especially in light of expected increased demands. One way out is to take a software-defined and data-focused approach.

While IT complexity can never be completely eliminated, reducing it to a more manageable level can certainly improve the cost structure, among other benefits. The battle over software-defined, data-focused approaches (which are part of the private, hybrid and public cloud discussion) should be an interesting one. Sanbolic is representative of the software vendors trying to make inroads against established large software and hardware vendors. Whether they know it or not, enterprises have a stake in how this all shakes out.

Sanbolic is not a client of David Hill and the Mesabi Group.

8/5/2013 | 3:28:13 PM
re: Tackling Information Infrastructure Complexity
Topics seem to pop up in bunches for me. There was a Bay Area meetup last week where the topic was Big Data and SDN. The panel included VMware, Citrix, Savvis, and a networking incubator called The Fabric.

One of the things the panel talked a lot about was that all of this data was going to drive a much heavier reliance on math and algorithms. Doing something useful with the data is non-trivial, and the algorithms will likely play a larger role in intelligent workload distribution than they do today.
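The kind of algorithmic workload distribution the panel alluded to can be illustrated with a toy greedy heuristic: send each incoming job to whichever node currently carries the least total load. This is a generic textbook approach, not any vendor's implementation, and the job and node names are made up:

```python
import heapq

def distribute(jobs, nodes):
    """Greedy least-loaded assignment: each job (name, cost) goes to
    the node with the smallest accumulated load so far."""
    heap = [(0, n) for n in nodes]  # (current load, node name)
    heapq.heapify(heap)
    assignment = {n: [] for n in nodes}
    for job, cost in jobs:
        load, node = heapq.heappop(heap)
        assignment[node].append(job)
        heapq.heappush(heap, (load + cost, node))
    return assignment

# Four jobs of varying cost balanced over two nodes.
result = distribute([("a", 5), ("b", 3), ("c", 2), ("d", 1)], ["n1", "n2"])
```

Real schedulers layer far more intelligence on top (data locality, failure domains, predicted load), which is where the heavier math the panel mentioned comes in.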

It's interesting because an early indicator of which vendors are best equipped to do this might not be traditional expertise but rather professional mathematicians on staff. I thought the implications for skill sets were interesting.

Mike Bushong (@mbushong)