Insider: De-Dupe Demystified

Report takes a hard look at early de-duplication products, weighing their pros and cons

May 5, 2006


Data de-duplication is becoming all the rage in backup these days, but not all de-duplication products are created equal, according to the latest Byte and Switch Insider. (See De-Dupe Streamlines Backup.)

The report, Data De-Duplication: Storage Networking Bust-Up, looks at the technology known variously as single-instance storage, capacity optimization, commonality factoring, and data reduction. Under any name, data de-duplication transmits only data that has changed since the last backup, in contrast to the traditional model of backing up all of the data from every site on a weekly or daily basis.
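
To make the concept concrete, the short Python sketch below shows the single-instance idea: each segment of a backup stream is hashed, only segments the store has not yet seen are kept, and later backups simply point at the existing copies. The 8-KByte segment size, SHA-256 hash, and in-memory store are assumptions made for illustration, not any vendor's design.

    # Minimal sketch of single-instance storage (illustrative only, not any
    # vendor's implementation): hash each segment and keep one copy per digest.
    import hashlib

    SEGMENT_SIZE = 8 * 1024      # assumed 8-KByte segments, purely for illustration
    stored_segments = {}         # digest -> segment bytes (the single-instance store)

    def backup(data: bytes) -> list[str]:
        """Record a backup; returns the 'recipe' of digests describing it."""
        recipe = []
        for offset in range(0, len(data), SEGMENT_SIZE):
            segment = data[offset:offset + SEGMENT_SIZE]
            digest = hashlib.sha256(segment).hexdigest()
            if digest not in stored_segments:     # only previously unseen data is kept
                stored_segments[digest] = segment
            recipe.append(digest)                 # a pointer to the stored copy
        return recipe

    def restore(recipe: list[str]) -> bytes:
        """Rebuild the original stream from the stored segments."""
        return b"".join(stored_segments[d] for d in recipe)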

Data de-duplication is an attractive concept for companies looking to lower bandwidth costs and improve backup performance. (See ADIC in De-Dupe Deal and De-Dupers Demand Disk Mindset.) But like many new technologies, de-duplication raises new issues with customers looking to make buying decisions.

The major differences among products lie in where the de-duplication takes place and the size of the segments into which files are split, says the report. Users should understand those differences before buying.

For instance, ADIC, Avamar, and Symantec use software agents on PCs, laptops, and servers to compress data at the source. This keeps the data sent over the LAN and WAN to a minimum and reduces backup time. However, the approach requires breaking files into segments and rebuilding them when necessary, which takes a great deal of processing power. It can also be harder to deploy, because the agent must be installed on every client being backed up.

Asigra and Data Domain instead put the software on appliances sitting behind the backup servers, so those devices shoulder the de-duplication processing rather than the servers. Diligent Technologies' virtual tape library (VTL) software, for example, runs on a Linux server, collecting data from backup servers and copying it to a VTL. But critics argue that this approach simply reduces the capacity needed in the storage pool without directly addressing the bandwidth problems associated with sending large amounts of data over the SAN.
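
The sketch below illustrates, under simplified assumptions, why the client-agent approach saves bandwidth: the agent asks the target which segment digests it already holds and ships only the missing ones over the network. The InMemoryTarget class and its method names are invented for this example and do not reflect any real product API.

    # Illustrative contrast of the client-agent approach: the agent hashes data
    # on the host and sends only segments the target does not already hold, so
    # little data crosses the LAN/WAN. A target-side appliance would instead
    # receive the full stream and de-duplicate on arrival.
    import hashlib

    class InMemoryTarget:
        """Stand-in for a backup target or de-dupe appliance (assumed, not real)."""
        def __init__(self):
            self.segments = {}                    # digest -> segment bytes

        def unknown(self, digests):
            return {d for d in digests if d not in self.segments}

        def put(self, digest, segment):
            self.segments[digest] = segment

    def agent_backup(data: bytes, target: InMemoryTarget, seg_size: int = 8192) -> int:
        """Client-side de-dupe; returns the number of bytes actually sent."""
        chunks = [data[i:i + seg_size] for i in range(0, len(data), seg_size)]
        digests = [hashlib.sha256(c).hexdigest() for c in chunks]
        needed = target.unknown(digests)          # one query instead of the full stream
        sent = 0
        for chunk, digest in zip(chunks, digests):
            if digest in needed:
                target.put(digest, chunk)         # only new segments cross the wire
                sent += len(chunk)
        return sent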

Another important factor is the size of the file segments. Smaller segments require more pointers and a larger index, which can slow backup times. However, smaller segments usually translate into a better compression ratio.
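
That trade-off can be made visible with a rough sketch: running the same repetitive data through two segment sizes shows the smaller segments producing far more index pointers but also a much higher de-duplication ratio. The synthetic data and segment sizes below are arbitrary assumptions chosen only to demonstrate the effect.

    # Rough illustration of the segment-size trade-off: smaller segments mean more
    # pointers (a larger index to search) but tend to find more duplicate data.
    import hashlib

    def dedupe_stats(data: bytes, seg_size: int):
        digests = [hashlib.sha256(data[i:i + seg_size]).hexdigest()
                   for i in range(0, len(data), seg_size)]
        pointers = len(digests)                              # index entries to maintain
        ratio = len(data) / (len(set(digests)) * seg_size)   # crude de-dupe ratio
        return pointers, ratio

    sample = (b"A" * 4096 + b"B" * 1024) * 100               # highly repetitive test data
    for size in (512, 8192):
        pointers, ratio = dedupe_stats(sample, size)
        print(f"segment={size:5d} B  pointers={pointers:5d}  ratio~{ratio:.1f}x")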

Other factors that affect de-duplication performance include whether compression is done at the block or byte level, the compression technology used, integration with backup software, and price.

The report takes an in-depth look at de-duplication products from ADIC, Asigra, Avamar, Data Domain, Diligent, and Symantec, including interviews with customers using de-duplication.

Dave Raffo, Senior Editor, Byte and Switch

Organizations mentioned in this article:

  • Advanced Digital Information Corp. (Nasdaq: ADIC)

  • Asigra Inc.

  • Avamar Technologies Inc.

  • Data Domain Inc. (Nasdaq: DDUP)

  • Diligent Technologies Corp.

  • Symantec Corp. (Nasdaq: SYMC)

    Data De-Duplication: Storage Networking Bust-Up is available as part of an annual subscription (12 monthly issues) to Byte and Switch Insider, priced at $1,350. Individual reports are available for $900.

    To subscribe, or for more information, please visit: www.byteandswitch.com/insider.

    To request a free executive summary of the report, or for details on multi-user licensing options, please contact:

    Jeff Claudino
    Sales Manager
    Insider Research Services
    619-229-9940
    [email protected]

    For review copies, members of the media may contact:

    Gabriel Brown
    Chief Analyst
    Insider Research Services
    44-20-7701-9330
    [email protected]
