Commentary
De-Dupe Fragmentation

For something that was supposed to become 'just a feature,' people sure do get passionate about data de-duplication

Welcome to the Hot Zone, our new blog on Byte and Switch where we will discuss the storage and virtualization trends that are affecting the data center. I'll use the first several entries to finish up the de-duplication topic. Well, maybe not finish -- how about continue? I'm sure we will revisit the topic from time to time.

For something that was supposed to become "just a feature," people sure do get passionate about de-duplication.

One of the areas I wanted to address was source-side de-dupe, which a few posters have proclaimed the be-all and end-all of de-duplication. But before we handle that firecracker, let's discuss the fragmentation of the de-dupe market.

First, de-duplication is showing up everywhere: primary storage, backup storage, archive storage, and even the wide-area network. Of course, there are different types of de-dupe implementations on each of these platforms, and each of the vendors thinks its solution is the best.
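Whatever the platform, most of these implementations share the same core idea: break data into chunks, fingerprint each chunk with a strong hash, and store only the unique chunks plus a "recipe" for reassembly. A minimal fixed-block sketch in Python (the function names here are hypothetical, and real products use variable-size chunking and far more sophisticated indexing):

```python
import hashlib

def dedupe_chunks(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks and keep only unique ones."""
    store = {}    # hash -> chunk: the de-duplicated chunk pool
    recipe = []   # ordered list of hashes used to rebuild the original
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # store only the first copy
        recipe.append(digest)
    return store, recipe

def reconstruct(store, recipe):
    """Rebuild the original byte stream from the pool and recipe."""
    return b"".join(store[h] for h in recipe)
```

For a stream with repeated 4 KB blocks, the pool holds one copy per unique block while the recipe preserves the full sequence; the differences between products largely come down to where this runs (source vs. target) and how chunk boundaries are chosen.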

The guys that OEM technology seem to be at a disadvantage as the de-duplication use case expands. They end up with one de-dupe technology for primary storage, a different one for NAS storage, another for backup (maybe two? or three?), and one for archives. This has to be confusing. Their advantage is that they can move into this market faster -- but at what cost? A totally fragmented data reduction strategy?
