Howard Marks

Network Computing Blogger



Source vs. Target Deduplication: Scale Matters

I had a nice conversation with the CEO of a backup software vendor, who shall remain nameless, at last week's Exec Event storage industry schmooze-fest. He asked why I thought target deduplication appliances like those from Data Domain, Quantum and Sepaton were still around. Why, he asked, doesn't everyone shift to source deduplication, since it's so much more elegant?

Because it runs in agents on the hosts, source deduplication leverages the CPU horsepower of all the hosts being backed up to do some of the heavy lifting inherent in data deduplication. That should reduce the processing power needed in the target system and thus hold down its cost. And while all deduplication schemes reduce the disk space your backup data consumes, deduplicating at the source also minimizes the network bandwidth required to send the backups from source to target.
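For a concrete picture of where that host-side work goes, here's a minimal sketch in Python of what a source-side dedup agent does: hash chunks locally, ask the target which hashes it already holds, and ship only the new ones. The fixed-size chunking, function names and in-memory stand-ins for the target's index are my own simplifications, not any vendor's actual protocol; real products generally use variable-size, content-defined chunking.

# Minimal source-side dedup sketch (illustration only, not any vendor's protocol).
# The agent spends host CPU hashing chunks, queries the target for hashes it
# already stores, and sends only the chunks the target hasn't seen.

import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # fixed 4 MB chunks, for simplicity

def chunk_file(path):
    """Yield (sha256_hex, chunk_bytes) for each chunk of the file."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            yield hashlib.sha256(chunk).hexdigest(), chunk

def backup_file(path, target_index, target_store):
    """Back up one file, transmitting only chunks the target doesn't already hold.

    target_index (a set of hashes) and target_store (a dict) stand in for the
    backup target's index and chunk store; in a real product these lookups and
    writes would be RPCs across the network.
    """
    recipe = []      # ordered chunk hashes needed to reassemble the file on restore
    sent_bytes = 0
    for digest, chunk in chunk_file(path):
        if digest not in target_index:    # a small hash query instead of the whole chunk
            target_store[digest] = chunk  # only new data crosses the wire
            target_index.add(digest)
            sent_bytes += len(chunk)
        recipe.append(digest)
    return recipe, sent_bytes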

Since most branch offices run a single shift--leaving servers idle for a 12-hour backup window--and WAN bandwidth from the branch office to the data center comes dear, source deduplication is a great solution to the ROBO (remote office, branch office) backup problem. 
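To put rough numbers on that bandwidth argument (the figures below are assumptions for illustration, not measurements), consider a branch office that generates 500 GB of backup data a night over a 1.5 Mbps T1, with source dedup and compression cutting what actually crosses the WAN to about 1% of that:

# Back-of-the-envelope WAN math with assumed figures, not measurements.
full_backup_bytes = 500e9                  # assumed nightly backup at the branch
wan_bps = 1.5e6                            # assumed T1-class WAN link
deduped_bytes = full_backup_bytes * 0.01   # assumed ~100:1 reduction at the source

hours_full = full_backup_bytes * 8 / wan_bps / 3600
hours_deduped = deduped_bytes * 8 / wan_bps / 3600

print(f"Raw nightly backup:  {hours_full:,.0f} hours")    # ~741 hours; it would never finish
print(f"Deduplicated backup: {hours_deduped:.1f} hours")  # ~7.4 hours; fits the overnight window

Even if the real reduction ratio is half that, the deduplicated transfer is the only one that fits in a 12-hour window.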

As a result, and because of the generally abysmal state of ROBO backup at the time, early vendor marketing for source deduplication products such as EMC's Avamar and Symantec's PureDisk pitched them as ROBO solutions.

Source dedupe fits well wherever CPU cycles are available during the backup window. It's an even better fit where bandwidth is constrained, such as a virtual server host backing up 10 guests at a time. Since it's just software, the price is usually right. And since vendors have started building source deduplication into the agents for their core enterprise backup solutions, users don't even need to junk NetWorker, Tivoli Storage Manager or NetBackup to dedupe at the source.

