Your Dedupe Sweet Spot


Joe Hernick

December 8, 2009


Running out of storage space? Running out of money? I feel for ya, buddy. Most of us are in a rough spot in the real world looking for a place to cram all our stuff; half the programming on cable channels seems to be about folks de-cluttering their homes, getting rid of stuff they don't need and culling the crap they do keep (ten teddy bears gone, hubby gets to retain one childhood keepsake... belt-buckle collection sold at a garage sale, wife gets to keep her first-place rodeo buckle... you get the idea). It all boils down to making more efficient use of the space they already have while staying under tight budget constraints.

How's your data storage capacity forecast? How about your budget for the next four quarters? If you're like most of us, you're short on spindles and dollars. I'll also bet a laundry list of drivers is pushing your storage trends higher than your two-year-old, three-year plan SWAGed. Internal retention and DR/BCP policies, D2D backups necessitated by limited production windows, centralized backups for remote users, multi-site replication, externally driven retention compliance requirements... and that's just at the bottom of your storage pyramid. How much of that backup and archival data is clutter?

How many identical teddy bears are crammed in your data closets? The promise of data deduplication tech lets you keep one teddy and purge the copies. Different flavors of deduping analyze your data anywhere from the byte level up to blocks and files; the level of analytical granularity impacts speed and compression ratios. A recent InformationWeek Analytics survey of 437 IT professionals turned up some interesting results: more than half are either deduping or have a trial under evaluation. Of those, 34 percent have landed between 10-to-1 and 20-to-1 reductions thanks to deduping, 20 percent are somewhere between 20-to-1 and 50-to-1, and 4 percent have found more than 50-to-1 reductions in storage. That's a lot of teddy bears, or files, or blocks or bytes of data. All vendor claims come with the standard caveat that your mileage will vary depending on your content mix and your current backup strategy.
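To make the block-level flavor concrete, here's a minimal Python sketch of fixed-size block deduplication: hash each block, store only the first copy of each unique hash, and keep an ordered "recipe" of hashes so the original stream can be rebuilt. This is an illustration only, not any vendor's implementation; real products typically use variable-size chunking, indexes on disk, and far smarter collision handling.

```python
import hashlib

def dedupe_blocks(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks, storing each unique block once."""
    store = {}    # SHA-256 digest -> the single stored copy of that block
    recipe = []   # ordered digests needed to reconstruct the stream
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # keep only the first copy seen
        recipe.append(digest)
    return store, recipe

def rebuild(store, recipe) -> bytes:
    """Reassemble the original stream from the recipe of block hashes."""
    return b"".join(store[h] for h in recipe)

# Ten identical "teddy bears": ten copies of the same 4 KB block
data = b"\xbe" * 4096 * 10
store, recipe = dedupe_blocks(data)
ratio = len(data) / sum(len(b) for b in store.values())
print(f"unique blocks: {len(store)}, dedupe ratio: {ratio:.0f}:1")
```

With ten identical blocks the sketch stores one and reports a 10:1 ratio; mix in unique data and the ratio drops, which is exactly why vendor numbers vary with your content mix.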

One trend you might not be expecting also came out of the survey: 81 percent of those deploying deduplication are running on either Tier 1 or Tier 1 and Tier 2 storage. Think about all those VMs on your SAN. Now think about all the identical system files across all those virtual servers or VDI clients.

Still on the fence about dedupe? Take a look at this year's bidding war between EMC and NetApp to acquire Data Domain, ending in a $2.1 billion cash deal from EMC. Deduplication is hot. As a storage exec told me (off the record), anyone can build a "ghetto SATA array" cheaper than an enterprise-level dedupe appliance. Yes, spindles can be dirt cheap nowadays. But deduping can address the clutter in our closets; you can keep buying arrays to store multiple copies of the same stuff, or you can start to clean up your act. When we asked our non-deduping survey respondents their reasons for holding back, 37 percent were simply unfamiliar with the technology. Only 19 percent listed prohibitive cost. Of those who have implemented, more than half expect an ROI within one year.
