Howard Marks
Commentary
Object Storage's Path to the Enterprise

Object storage is a smart fit for enterprise use cases such as backup, but the need to write to custom APIs impedes adoption.

I spent several days in Miami recently at the Object Storage Summit with storage industry analysts and vendors, including Cleversafe, Scality, Data Direct Networks (DDN), Nexsan and Quantum. We spent a lot of time talking not just about how the various object storage systems worked, but also about what the object storage vendors have to do to move object storage further into the mainstream. The use cases are there, but vendors must make application integration easier for enterprise customers.

Like most emerging technologies, object storage found initial acceptance in a select set of vertical markets, such as Web application vendors, cloud service providers, high-performance computing, and media and entertainment. These organizations have lots of files that get created (think Shutterfly, for example), and rather than modify those files in place, their workflows maintain each version of each object to allow for different methods of reuse.

I was a bit more surprised at the level of success the object storage vendors were having in the intelligence community. A couple of the vendors spoke (in generalities of course) about how the data collected from keyhole satellites and Predator drones is stored and processed on object platforms.

Object storage has been less successful in the commercial space, which is a shame. When I was teaching backup seminars last year, I would regularly get users complaining that incremental backups of their NAS systems took days to complete. It took that long to walk their file system and figure out which of the millions of files changed, and therefore needed to be backed up, regardless of how much new data there was.
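The bottleneck the seminar attendees describe is structural: before a single byte is copied, the backup software must stat every file in the tree just to learn which ones changed. A minimal sketch of that scan (function name and logic are illustrative, not any particular backup product's code):

```python
import os


def files_changed_since(root, cutoff_epoch):
    """Walk the whole tree and return files modified after cutoff_epoch.

    Even when almost nothing changed, every directory entry must be
    stat()'ed -- this full walk is why NAS incrementals can take days
    on file systems with millions of files.
    """
    changed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) > cutoff_epoch:
                    changed.append(path)
            except OSError:
                continue  # file vanished mid-walk; skip it
    return changed
```

The cost is proportional to the total file count, not to the amount of new data, which is exactly the complaint.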

If those users could find a way to migrate old, stale files off the production server to an object storage system, they'd dramatically speed up their nightly incremental backups and reduce the size of their weekly full backups by 60% to 90%.

The best part is that the object storage system itself never needs to be backed up, which can save the organization a bundle in opex. Object systems use replication or advanced dispersal coding to protect the data against hardware or site failures. Object storage systems also create a new object every time a user modifies a file, keeping the old version around as long as the organization's retention policy requires, so the object store doesn't need backups to protect our data from the users, either.
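The write-once-per-modification behavior described above can be modeled in a few lines. This is a toy sketch, not any vendor's design: a real object store would also replicate or erasure-code each version across nodes or sites, which is omitted here.

```python
import hashlib
import time


class VersionedStore:
    """Toy model: every write creates a new immutable version.

    Old versions stay addressable until a retention policy removes
    them, so no separate backup is needed to undo a bad user edit.
    """

    def __init__(self):
        self._versions = {}  # key -> list of (timestamp, sha256, data)

    def put(self, key, data):
        digest = hashlib.sha256(data).hexdigest()
        self._versions.setdefault(key, []).append((time.time(), digest, data))
        return digest

    def get(self, key, version=-1):
        """Latest version by default; any older version by index."""
        return self._versions[key][version][2]

    def purge_older_than(self, key, retention_seconds):
        """Apply a retention policy, always keeping the newest version."""
        cutoff = time.time() - retention_seconds
        older = [v for v in self._versions[key][:-1] if v[0] >= cutoff]
        self._versions[key] = older + self._versions[key][-1:]
```

A modify-in-place file system needs backups to recover yesterday's contents; here yesterday's contents are simply another object.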

A major factor limiting object storage's acceptance in the corporate market is that each object storage vendor has its own SOAP- or REST-based API for getting data in and out of the system. This means companies and ISVs (independent software vendors) must customize their applications for each storage platform.

One interesting development is that vendors are adding support for the Amazon S3 API in addition to their native API. For object storage to take off in the corporate market, there has to be a standard interface for application vendors to write to.
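Until one interface wins, application vendors typically insulate themselves with a thin adapter layer: the application codes to one abstraction, and each vendor API gets its own backend. A hedged sketch of that shape (all class and method names here are hypothetical; the in-memory backend stands in for a vendor SDK so the example runs):

```python
from abc import ABC, abstractmethod


class ObjectBackend(ABC):
    """One adapter per vendor API -- the integration cost described above."""

    @abstractmethod
    def put_object(self, bucket, key, data):
        ...

    @abstractmethod
    def get_object(self, bucket, key):
        ...


class InMemoryBackend(ObjectBackend):
    """Stand-in for a real vendor SDK, kept local so the sketch is runnable."""

    def __init__(self):
        self._store = {}

    def put_object(self, bucket, key, data):
        self._store[(bucket, key)] = data

    def get_object(self, bucket, key):
        return self._store[(bucket, key)]


def archive_report(backend, report):
    # Application logic targets the abstract interface; supporting a new
    # vendor means writing a new backend, not rewriting the application.
    backend.put_object("reports", "latest", report)
```

A de facto standard such as the S3 API collapses this to a single backend, which is why vendors are converging on it.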

DDN takes an API-agnostic approach on its WOS (Web Object System); it supports its native high-performance API as well as Amazon S3's and CDMI's object APIs. The company also integrates with clustered file systems such as GPFS and Lustre, which are common in the HPC world, and Hadoop's HDFS, which provides the persistence that big data file systems have lacked.

Object storage is the solution for large organizations drowning in tens or hundreds of petabytes of unstructured data. If vendors can make application integration easier, the enterprise market may open up.

Comments
Tim Wessels, User Rank: Apprentice
1/7/2013 | 5:09:30 PM
re: Object Storage's Path to the Enterprise
The Object Storage Summit was basically an invitation-only event to get media and bloggers up to speed on object storage technology. The vendor storing exabytes of data for the NSA underground at Ft. Meade is CleverSafe. Until CDMI is widely supported, AWS S3 API compatibility is the de facto standard for object storage. Scality calls its implementation the RS2 connector, Basho's Riak CS offers S3 API compatibility, and Cloudian claims 100% fidelity with the AWS S3 API. Both Riak CS and Cloudian can also function as secondary object storage for OpenStack and CloudStack (CloudPlatform). However, it is third-party vendors like TwinStrata (iSCSI Cloud Array) and Riverbed (Whitewater) that let users deduplicate and cloud-store data, caching frequently used data on local disk while keeping the rest in cloud object storage. Software vendors like CloudBerry Lab and Gladinet also support backing up and syncing data with AWS S3-compatible object storage.
Inform-üZen, User Rank: Apprentice
12/14/2012 | 4:03:50 PM
re: Object Storage's Path to the Enterprise
It is great to see the level of interest and awareness around object technology in the storage market lately. It seemed like there were just a few of us evangelizing the technology over the past several years. There is tremendous potential for it well beyond the "big bucket" for storing stuff, where we've seen the capacity-hungry early adopters thus far. The informational value that object storage enables through metadata (as Larry identified) is being realized beyond the intelligence community and is one of the more intriguing areas of innovation. The same associations that link objects for mission analysis are being used for digital libraries, law enforcement and satellite imaging, and the potential in the commercial sector is significant.

informazen.wordpress.com
Larry Freeman, User Rank: Apprentice
12/7/2012 | 3:34:37 PM
re: Object Storage's Path to the Enterprise
Nice overview, Howard. I've been thinking lately that object storage could also be used to help increase the value of data. For instance, if a data object is sensor-generated, and all data objects from this type of sensor are grouped together, analytics could be run on the object group to spot trends and anomalies. Object storage would make this easier by classifying the source of the data object and tagging it in metadata. Medical research comes to mind...

Just a thought

Larry Freeman
NetApp Storage Technologist