
Analysis: Data De-Duping


Just a few years ago, disk-to-disk backup seemed almost too good to be true. Powered by inexpensive ATA (and later SATA) disk drives, D2D, whether implemented as virtual tape libraries or as a backup-to-disk option in your favorite backup application, made backups faster, eliminated mechanical failures in tape drives and libraries, and made it easier to deal with the continuous chorus of calls to the helpdesk for individual file restores.

Today, our disk-backup devices are filling up, and there's not enough space or power in the data center to add another petabyte of backup space, so we're keeping only two to three days' worth of backups on disk, when we'd like to keep a month's worth. Problem is, there's too much duplicate data in our backup sets. The good news is, vendors--smelling money, of course--are promising that their new data de-duplication products can provide 20-to-1, even 300-to-1 reductions in the amount of data we need to store. Can it be? Let's take a look.

De-duplication technology lets you store more backup data on a given set of disks. This can extend the period you keep disk backups and reduce your data center power and cooling costs. If you de-dupe data before sending it across the WAN, you can save on bandwidth, making online off-site backups practical at companies that used to rely on tape. The only drawback to data de-duplication is that it can slow down the backup process.
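The idea behind most de-duplication products can be sketched in a few lines. The toy example below (a minimal illustration, not any vendor's actual implementation) splits a backup stream into fixed-size blocks, stores each unique block once keyed by its SHA-256 hash, and keeps only a list of hashes — a "recipe" — for each backup. When two nights' backups share mostly identical data, the second backup adds almost nothing to the store:

```python
import hashlib

def dedupe(data, store, block_size=4096):
    """Fixed-block de-duplication: store each unique block once,
    keyed by its SHA-256 digest. Returns the recipe (ordered list
    of digests) needed to reconstruct the original stream."""
    recipe = []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep only the first copy
        recipe.append(digest)
    return recipe

def restore(recipe, store):
    """Rebuild the original stream from its recipe."""
    return b"".join(store[d] for d in recipe)

store = {}
night1 = b"A" * 4096 * 4              # four identical 4 KB blocks
night2 = b"A" * 4096 * 4 + b"delta"   # same data plus a small change

r1 = dedupe(night1, store)
r2 = dedupe(night2, store)

# ~32 KB of logical backup data, but only two unique blocks stored
print(len(store))                     # 2
assert restore(r1, store) == night1
assert restore(r2, store) == night2
```

Real products refine this with variable-length (content-defined) chunking so that an insertion near the start of a file doesn't shift every subsequent block boundary and defeat the matching — but the storage-saving principle is the same: identical chunks are stored once and referenced many times.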


  • Content-Addressable Storage: We discuss the drivers for implementing CAS, as well as what the technology can and cannot do.

