
Insider: De-Dupe Demystified

Data de-duplication is becoming all the rage in backup these days, but not all de-duplication products are created equal, according to the latest Byte and Switch Insider. (See De-Dupe Streamlines Backup.)

The report, Data De-Duplication: Storage Networking Bust-Up, looks at a technology known variously as single-instance storage, capacity optimization, commonality factoring, and data reduction. Whatever the name, data de-duplication transmits only data that has changed since the last backup, in contrast to the traditional model of backing up all of the data from every site on a daily or weekly basis.
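The underlying idea can be sketched in a few lines. The snippet below is an illustrative toy, not any vendor's implementation: it assumes fixed-size segments and SHA-256 fingerprints, and keeps an in-memory index of segment hashes so that a second backup transmits only segments the index has not seen before.

```python
import hashlib

def dedupe_backup(data: bytes, index: set, chunk_size: int = 4096) -> list:
    """Split data into fixed-size segments and return only those segments
    whose fingerprint is not already in the backup index -- i.e., data
    that is new or has changed since the last backup."""
    new_chunks = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in index:
            index.add(digest)       # remember this segment for next time
            new_chunks.append(chunk)
    return new_chunks

index = set()
day1 = b"A" * 4096 + b"B" * 4096        # first backup: both segments are new
sent1 = dedupe_backup(day1, index)      # 2 segments transmitted
day2 = b"A" * 4096 + b"C" * 4096        # second backup: one segment changed
sent2 = dedupe_backup(day2, index)      # only the changed segment transmitted
```

In this toy run the first backup sends both segments, while the second sends just the one that changed; real products use variable-size segmenting and persistent indexes, but the principle is the same.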

Data de-duplication is an attractive concept for companies looking to lower bandwidth costs and improve backup performance. (See ADIC in De-Dupe Deal and De-Dupers Demand Disk Mindset.) But like many new technologies, de-duplication raises new questions for customers trying to make buying decisions.

The major differences among products lie in where de-duplication takes place and in the size of the segments into which files are split, says the report. Users should understand both before buying.

For instance, ADIC, Avamar, and Symantec use software agents on PCs, laptops, and servers to compress data. This keeps data sent over the LAN and WAN to a minimum and reduces backup time. However, this approach requires breaking files into segments and rebuilding them when necessary -- and that takes a great deal of processing power. It can also be harder to deploy, because an agent must be installed on every client being backed up.
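Segment size matters because it sets the granularity at which duplicates can be found: smaller segments catch repetition that larger ones miss, at the cost of more fingerprints to compute and look up. The sketch below is a simplified illustration with made-up data, assuming fixed-size segments and SHA-256 fingerprints; it measures what fraction of a data stream would actually be stored after de-duplication at two different segment sizes.

```python
import hashlib

def stored_fraction(data: bytes, chunk_size: int) -> float:
    """Fraction of the input that must still be stored after
    de-duplicating fixed-size segments of the given size."""
    seen = set()
    stored = 0
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in seen:
            seen.add(digest)
            stored += len(chunk)
    return stored / len(data)

# 32 records, each a repeated 1 KB common block plus a unique 1 KB block.
data = b"".join(b"C" * 1024 + i.to_bytes(2, "big") * 512 for i in range(32))

coarse = stored_fraction(data, 4096)  # 4 KB segments: every segment spans
                                      # unique data, so nothing de-duplicates
fine = stored_fraction(data, 1024)    # 1 KB segments: the common block
                                      # collapses to a single stored copy
```

Here the 1 KB segments store roughly half the data while the 4 KB segments store all of it, which is why segment size is one of the key differentiators the report highlights.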
