Kevin Fogarty



Does DNA Hold the Keys to the Future of Data Storage?

Last Friday, researchers from Harvard published a paper describing how they were going to change everything in IT and the computer industry with just one new technique. Rather than always relying on data stored as clusters of magnetized grains on a spinning platter, they described a way to attach (fake) human DNA to a microchip to store so much data you could stash every byte on the Internet on a single thumb drive.

If you saw "microchip" and "DNA" and assumed it was a cool neuroscience experiment, you may have missed the story. Because it's storage. And storage is boring, especially for people who spend their days working on networks or software or in any other geek specialty.


The breakthrough, in a nutshell, was this: A team of Harvard geneticists working on ways to create a complete, entirely artificial chain of human DNA also discovered a technique to use DNA microchips to store ridiculously high volumes of data in a ridiculously small space.

DNA microchips, also called DNA microarrays, are microchips with strands of artificial DNA embedded in them, linked to the underlying circuitry and fused in place with melted plastic.

They're used mostly in the biotech industry to figure out which genes respond to which stimuli without having to have live cells or live patients right in front of the experimenter.

What DNA does best, however, is store data.

By using DNA sequencers for things they weren't designed to do, researchers are able to record tons of regular data within DNA--customer records, for example, as well as instructions for whether your eyes should be brown or blue.
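The basic idea--mapping ordinary bits onto sequences of nucleotides--can be sketched in a few lines of Python. To be clear, this is a toy code that packs two bits into each base, not the Harvard team's actual scheme (which reportedly used a one-bit-per-base code plus per-strand address blocks); it only illustrates how arbitrary data becomes a strand of A, C, G and T and back again.

```python
# Toy illustration: pack two bits into each DNA base.
# NOT the Harvard team's actual encoding scheme.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a string of nucleotides (4 bases per byte)."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from a nucleotide string."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"DNA")
print(strand)          # CACACATGCAAC -- each byte becomes four bases
print(decode(strand))  # b'DNA'
```

A real scheme also needs addressing (so strands can be reassembled in order) and error correction (synthesis and sequencing both make mistakes), which is where most of the engineering effort goes.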

Using their own techniques--which involved an inkjet printer to produce the synthetic strands of DNA and took a couple of days each to write and read the data on the microarrays--the Harvard team was able to store the full 5.27M bytes of data that made up a large genetics textbook on 55,000 strands of DNA, representing less than a thousandth of a gram of material.

Top-quality hard drives can store about 25G bytes per square inch, or between 5G and 6G bits per cubic millimeter, according to a paper in the journal Solid State Technology.

The DNA microchips ginned up by the Harvard team can hold 5.5 petabits, or 5.5 million gigabits per cubic millimeter. That's roughly 110 million percent more than a good hard drive or flash drive can manage today.
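Those figures are easy to sanity-check. Taking the low end of the hard-drive range quoted above (5 Gbits per cubic millimeter), a back-of-envelope calculation reproduces the ratio in the article:

```python
# Back-of-envelope check of the density comparison in the article.
dna_density = 5.5e6   # gigabits per cubic millimeter (DNA microarray)
hdd_density = 5.0     # gigabits per cubic millimeter (low end for hard drives)

ratio = dna_density / hdd_density
print(f"{ratio:,.0f}x denser")        # 1,100,000x denser
print(f"{ratio * 100:,.0f} percent")  # 110,000,000 percent
```

So "roughly 110 million percent" is just the million-fold density advantage expressed as a percentage.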

Keep in mind these numbers are purely imaginary--simply projections of future capacity based on reports from an experiment that succeeded in storing 600 times as much data on one microarray as anyone ever had before, and whose results may never be replicated.

Of course, engineers and storage vendors being the way they are, it's likely that if the Harvard team's results are never matched it will be because commercial R&D teams beat those results like a dead horse, not because they couldn't reach the mark.

Either way, if DNA microarray storage turns into a real, practical, stable, cost-effective way to store data, it will knock the pins out from under the single factor limiting changes in the design of digital hardware--the size of its onboard storage.

