George Crump



Deduplication Moves Beyond Deduplication

While I don't think we ever settled the inline vs. post-process debate, the basic blocking and tackling of deduplication seems to be a foregone conclusion. Some will still argue inline vs. post-processing, but users are now looking for more. What is interesting is how deduplication vendors are now trying to differentiate themselves from each other.
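The "blocking and tackling" in question can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation: data is split into fixed-size blocks, each block is fingerprinted with a cryptographic hash, and only unique blocks are stored (real products typically use variable-size chunking and a persistent index).

```python
import hashlib

BLOCK_SIZE = 4096  # fixed-size blocks for simplicity; products often chunk variably


def dedupe(data: bytes):
    """Split data into blocks and keep only one copy of each unique block."""
    store = {}    # hash -> block: the unique-block store
    recipe = []   # ordered hashes needed to rebuild the original stream
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # a duplicate block is never stored twice
        recipe.append(digest)
    return store, recipe


def rehydrate(store, recipe) -> bytes:
    """Reassemble the original data from the block store and the recipe."""
    return b"".join(store[h] for h in recipe)


# Two "backups" that share their content deduplicate to the same few blocks:
data = b"A" * 8192 + b"B" * 4096
store, recipe = dedupe(data + data)   # the second copy adds no new blocks
assert len(store) == 2                # only two unique blocks stored
assert rehydrate(store, recipe) == data + data
```

The inline vs. post-process argument is simply about *when* this hashing happens: as the data arrives, or in a sweep after it has landed on disk.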
Data Domain, for example, today announced enhancements to its replication capabilities. Replication of backups is one of the more impressive side benefits of deduplication. Its product can now cascade replication jobs between DR sites, handle a larger "fan-in" during many-to-one replication, and deliver improved performance in high-bandwidth situations.
Nexsan, alternatively, recently added power-managed deduplication, a first as far as I know. Leveraging a relationship with FalconStor, the product can power down hard drives during off cycles. Power-managed deduplication means that the backup jobs and the deduplication cleanup work have to finish soon enough that the drives can be idled. Power efficiency in deduplication has in the past been measured by the power supplied to actual disk backup capacity. If your environment allows for quick backups, then the power efficiency of deduplication can move beyond the efficiency of capacity to the efficiency of powered-down capacity.
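The scheduling math behind that claim is straightforward. Using hypothetical figures (the hours below are illustrative, not Nexsan's numbers), the spin-down opportunity is simply whatever is left of the daily window after the backup ingest and the post-process cleanup complete:

```python
def idle_hours(window_h: float, backup_h: float, cleanup_h: float) -> float:
    """Hours per day the backup disks can sit spun down.

    If backup plus cleanup overruns the window, there is no idle time.
    """
    return max(0.0, window_h - (backup_h + cleanup_h))


# Hypothetical example: a 6-hour nightly backup plus 2 hours of
# deduplication cleanup inside a 24-hour window.
spindown = idle_hours(24.0, 6.0, 2.0)
print(f"Drives can idle {spindown:.0f} h/day ({spindown / 24.0:.0%})")
```

The faster the backup and cleanup finish, the larger the powered-down fraction — which is why quick backups are the precondition the paragraph above describes.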
In backup jobs, high deduplication rates are almost assured. In primary storage, where there is, or at least should be, less duplicate data, the going gets a little tougher. It seems that any solution in this space should offer compression, as Nexenta does with its ZFS-based product, Storwize with its inline appliance, or Ocarina Networks with its out-of-band optimizer. Ocarina adds deduplication to the process, as well as content-specific optimizers that provide a greater understanding of the file formats being processed. In addition, it can migrate data and track its location while optimizing it.
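Combining the two techniques is conceptually simple: deduplication removes whole-block redundancy across the data set, then compression squeezes whatever redundancy remains inside each unique block. A rough sketch (again illustrative, not any of these vendors' code) using zlib:

```python
import hashlib
import zlib


def dedupe_and_compress(data: bytes, block_size: int = 4096):
    """Deduplicate fixed-size blocks, then compress each unique block."""
    store, recipe = {}, []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:
            # Compression only runs once per unique block, so the two
            # optimizations compound rather than duplicate effort.
            store[digest] = zlib.compress(block)
        recipe.append(digest)
    return store, recipe


def restore(store, recipe) -> bytes:
    """Decompress and reassemble the original data."""
    return b"".join(zlib.decompress(store[h]) for h in recipe)
```

On primary storage, where few blocks repeat, the compression stage does most of the work; on backup data the ratio flips.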
Finally, from companies like NEC, Permabit, and Tarmin, we are seeing more complete disk archive products that can leverage the deduplication engine to improve replication and compliance and to address storage scaling issues. While capacity efficiency will be at the heart of the next era of deduplication, the next generation of products will have to leverage the deduplication investment to produce products that allow deduplication to move beyond just deduplication.
The next era of deduplication is going to be a market filled with options for the data center. For now, expect to have two or three different deduplication solutions in your environment, but also expect those solutions to do more than just optimize capacity; expect them to add value to other services by leveraging their investment in deduplication.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Storage Switzerland's disclosure statement.

