Howard Marks | Commentary | 09:42 AM
When Hashes Collide


If there was any doubt in my mind that data deduplication is a mainstream technology, it was wiped out when I saw--in the business section of The New York Times last week--a full-page ad from Symantec touting its deduplication technology. Even so, I still occasionally run into people who consider deduplication to be a dangerous form of black magic that is likely to mangle their data and end their careers. This attitude represents an overestimation of the likelihood of a hash collision in deduplication and of the reliability of more traditional backup media.

First, let's look at the reliability of the other components in your storage system. Today's hard drives are rated to suffer an unrecoverable read error once every 10^14 to 10^16 bits read (roughly 12.5 to 1,250TB). To detect read errors and allow the array controller to rebuild the data from an Error Checking and Correction (ECC) stripe, enterprise drives add a 16-bit CRC (Cyclic Redundancy Check) in the T10 Data Integrity Field (DIF) -- a check that will itself fail to detect one error in 64K (65,536). As your data travels across an Ethernet or Fibre Channel network, it is protected by a 32-bit CRC, which will return the right value for the wrong data about once in four billion (2^32) times.
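To put those figures on a common footing, here's a quick back-of-the-envelope sketch. The conversions and CRC odds below are standard arithmetic, not vendor numbers:

```python
def bits_to_tb(bits):
    """Average number of bits read before one unrecoverable error, in TB."""
    return bits / 8 / 1e12

# 10^14-bit and 10^16-bit unrecoverable-error rates, as terabytes read per error
low, high = bits_to_tb(1e14), bits_to_tb(1e16)   # 12.5 TB and 1,250 TB

# An n-bit CRC maps data into 2^n possible check values, so a random
# corruption slips past it undetected about once in 2^n times.
t10_dif_misses = 2**16   # 65,536 -- the 16-bit CRC in the T10 DIF
crc32_misses = 2**32     # ~4.3 billion -- the 32-bit Ethernet/FC CRC
```

The takeaway: even the "trusted" path from disk to network has small but nonzero odds of passing along corrupted data.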

Finally, if you're avoiding deduplication because you don't trust it, you write the data to an LTO-5 tape, which has a rated error rate of one bit in 10^17. Well, one in 10^17 sounds great! By comparison, the odds of winning the Powerball jackpot are about one in 2 x 10^8, so LTO-5's rated error rate is roughly a billion times better than that. Of course, the spec sheet also says that figure covers non-media errors only, so errors caused by tape mishandling, overuse and the like aren't included -- or easily calculable.
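The "billion times better" claim is just the ratio of the two figures. A quick sanity check, assuming Powerball jackpot odds of about one in 2 x 10^8:

```python
lto5_bit_error_rate = 1e-17   # one undetected error per 10^17 bits (spec-sheet figure)
powerball_jackpot = 2e-8      # roughly one in 2 x 10^8 (assumed odds)

# How many times less likely is an LTO-5 bit error than a jackpot win?
ratio = powerball_jackpot / lto5_bit_error_rate   # ~2e9, on the order of a billion
```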

So how do those reliability levels compare to a typical deduplicating backup target? Among hash-based deduplicating systems, SHA-1 is the most commonly used hash function. With a 160-bit (20-byte) hash value, the odds of any two different blocks generating the same hash are about one in 10^48 (2^160 is roughly 1.5 x 10^48), which anyone will admit is a really big number. Of course, what we're actually worried about is the odds of any two blocks in our data center generating a hash collision -- the classic birthday problem -- and that depends on the amount of data in the deduplication universe.
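For a concrete feel of how the size of the deduplication universe affects those odds, here's the standard birthday-bound approximation. The block size and data volume below are illustrative assumptions on my part, not figures from any vendor:

```python
import math

def collision_probability(n_blocks, hash_bits=160):
    """Birthday-bound approximation: p ~= 1 - exp(-n^2 / 2^(bits+1)).
    math.expm1 preserves precision when p is astronomically small."""
    return -math.expm1(-(n_blocks**2) / 2**(hash_bits + 1))

# Suppose 1 PB of unique data, deduplicated in 8 KB blocks:
n = 10**15 // 8192            # ~1.2 x 10^11 blocks
p = collision_probability(n)  # on the order of 10^-27 for SHA-1
```

Even at a petabyte of unique blocks, the collision probability stays around 10^-27 -- many orders of magnitude below the undetected-error rates of the disks, networks and tapes we already trust.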

As my friend W. Curtis Preston says, it's more likely that, on any given day, Jessica Alba will come running to me to be mine forever than that two blocks in my data will wrongly generate the same hash. The former is at least possible: Ms. Alba and I are both alive. But given that I'm an old, fat, geeky guy in New Jersey and she's, well, Jessica Alba, it's highly improbable.

Howard Marks is founder and chief scientist at Deepstorage LLC, a storage consultancy and independent test lab based in Santa Fe, N.M., concentrating on storage and data center networking. In more than 25 years of consulting, Marks has designed and implemented storage ...