
Can Facebook Bring Back Optical Storage?

Like most in the storage business, I haven’t paid a lot of attention to optical disks since Plasmon went into receivership in 2009. Sure, 9GB magneto-optical WORM disks were the medium of choice for compliance archives at the turn of the century, but software-based WORM systems such as EMC Centera and NetApp SnapLock could store more data and access it faster. As magnetic disk and tape capacities steadily increased and optical capacity stalled at 60GB in Plasmon’s UDO2, I figured optical storage’s day had come and gone.

However, at last week’s Open Compute Summit, Facebook Vice President of Infrastructure Engineering Jay Parikh demonstrated a prototype Blu-ray library for cold data storage. Does this mean optical storage is due for a comeback? Or is this a solution only Facebook could love?

While most organizations have a constantly growing pool of data that needs to be retained but isn’t actively accessed, Facebook’s endless collection of frat boy selfies and kitty photos dwarfs pretty much anyone else's. Parikh said the company just brought up its first cold storage facility holding 30PB of data and expects to have 150PB of cold data in just a few months.

When you’re dealing with that much data, things like power consumption and data center floor space start to become significant issues in your choice of storage systems. That makes nearline media -- optical disks or tape cartridges permanently ensconced in a library -- attractive. Unlike conventional disk arrays, they draw very little power when idle, and because the media and drives are separate, the cost per gigabyte of media is significantly lower than the cost per gigabyte of disk.

Facebook’s jukebox holds over 10,000 100GB Blu-ray disks in a two-tiered architecture with 24 magazines, each holding 36 cartridges of 12 disks. The library’s robot can pick a disk and insert it into one of the library’s Blu-ray drives in under 30 seconds. Facebook didn’t specify the number of Blu-ray drives in the jukebox, but my impression is that it’s just a handful. Parikh said these new optical cold storage systems would cost Facebook half as much as its current cold storage system and use just one-fifth the power.
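For what it’s worth, those figures work out to a bit over a petabyte per rack. Below is a quick back-of-envelope sketch in Python; the magazine, cartridge, and disk counts come from Facebook’s description, and I’m assuming decimal gigabytes of raw capacity with no allowance for formatting or error-correction overhead.

```python
# Back-of-envelope capacity for the Blu-ray jukebox described above.
# Counts come from Facebook's description; overhead is ignored.

MAGAZINES_PER_RACK = 24
CARTRIDGES_PER_MAGAZINE = 36
DISKS_PER_CARTRIDGE = 12
GB_PER_DISK = 100  # 100GB four-layer Blu-ray media

disks_per_rack = MAGAZINES_PER_RACK * CARTRIDGES_PER_MAGAZINE * DISKS_PER_CARTRIDGE
capacity_pb = disks_per_rack * GB_PER_DISK / 1_000_000  # decimal GB -> PB

print(f"{disks_per_rack:,} disks per rack")     # 10,368 disks
print(f"{capacity_pb:.2f} PB raw per rack")     # ~1.04 PB
```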

That current solution uses 16 2U, 30-drive Open Vault enclosures and two controller servers per rack, for a total of 480 disk drives per rack. Using the highest-capacity drives available today -- HGST’s 6TB helium-filled model -- they would store 2.88PB per rack. The current solution is, as disk systems go, already pretty energy efficient because it spins down idle disk drives. In fact, the system can only spin up one disk drive per Open Vault shelf, so only 32 drives per rack can be spun up at a time.
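Running the same sort of numbers on the disk rack shows both why it holds more and why Facebook wants something cheaper and cooler: plenty of raw capacity, but only a small fraction of the drives can ever be spinning. The enclosure and drive counts come from the paragraph above; the two-shelves-per-enclosure figure is my inference from the 32-drive limit.

```python
# Rough math for the current Open Vault cold storage rack.
# Enclosure and drive counts are from the article; shelves per
# enclosure is inferred from the 32-drives-spinning-per-rack limit.

ENCLOSURES_PER_RACK = 16      # 2U, 30-drive Open Vault enclosures
DRIVES_PER_ENCLOSURE = 30
TB_PER_DRIVE = 6              # HGST 6TB helium-filled drives
SHELVES_PER_ENCLOSURE = 2     # assumed: two drive shelves per enclosure,
                              # one drive spun up per shelf at a time

drives_per_rack = ENCLOSURES_PER_RACK * DRIVES_PER_ENCLOSURE
capacity_pb = drives_per_rack * TB_PER_DRIVE / 1000
spinning = ENCLOSURES_PER_RACK * SHELVES_PER_ENCLOSURE

print(f"{drives_per_rack} drives per rack, {capacity_pb:.2f} PB raw")  # 480 drives, 2.88 PB
print(f"at most {spinning} drives spinning ({spinning / drives_per_rack:.0%} of the rack)")
```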

Gio Coglitore, Facebook’s director of hardware design, posted a video demo of the Blu-ray based cold storage system.

While the Facebook folks said the media was certified for 50 years, I'm not that confident. Having seen how disappointing CD-R and DVD data endurance has been, to say the least, I just wouldn’t put my data on one optical disk and expect to be able to read it 50 years later.

The four-layer Blu-ray disks Facebook is using have only been around for a year or two. The 50-year expectation comes from accelerated aging that theoretically emulates 50 years of real time. In theory, theory and practice are the same thing. In practice, they’re not.

Even if the media really does hold the bits for 50 years, the rubber parts and lubricants on the Blu-ray drives won’t. The key to long-term data accessibility is being able to get a drive. I’ve long advised my clients that whenever the drives for the media they’re using stop being made, they should start migrating their data.

My question is: why not use tape? A one-frame tape library like SpectraLogic’s T680 has 670 slots, which with LTO-6 would add up to 1.7PB. Using IBM TS1140 enterprise drives would boost capacity to 2.68PB, or two and a half times as much as the Blu-ray solution.
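To put the comparison side by side, here’s the same rough math for a single frame, using native (uncompressed) cartridge capacities and the roughly 1PB Blu-ray rack worked out earlier; compression would shift the numbers further in tape’s favor.

```python
# Per-frame tape capacity vs. the Blu-ray rack, native (uncompressed) figures.

SLOTS = 670          # SpectraLogic T680, single frame
LTO6_TB = 2.5        # LTO-6 native capacity per cartridge
TS1140_TB = 4.0      # IBM TS1140 (3592 JC) native capacity per cartridge
BLURAY_PB = 1.04     # the ~10,000-disk Blu-ray rack from earlier

lto6_pb = SLOTS * LTO6_TB / 1000       # ~1.7 PB
ts1140_pb = SLOTS * TS1140_TB / 1000   # ~2.68 PB

print(f"LTO-6:  {lto6_pb:.2f} PB per frame")
print(f"TS1140: {ts1140_pb:.2f} PB per frame, "
      f"~{ts1140_pb / BLURAY_PB:.1f}x the Blu-ray rack")
```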

If the tapes are in LTFS format, access time to a given file would be 90 seconds or so, compared to 45 to 60 seconds for the Blu-ray system. Finally, both the LTO consortium and IBM have roadmaps for their tape drives, mapping out another two generations and four times the capacity per cartridge. It’s also worth noting that since Quantum still makes LTO-3 drives, and LTO drives can read tapes from two generations back, every LTO tape ever written can be read by a drive still in production.

I just don’t believe that a write-once, read-50-years-later plan can work. Even if it could, I think other pressures will cause archival data to be migrated to new media every 10 years or so. Historically, data density on both tape and disk drives has increased about eightfold over a 10-year period. At that point, data center resource issues will simply make migrating down to one-eighth as much space look more attractive than maintaining old kit.
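To make that arithmetic explicit: an eightfold density increase per decade is roughly 23% a year, and a wholesale migration at the end of the decade leaves the archive in about an eighth of its original racks. The sketch below is just that calculation; the 100-rack starting point is an arbitrary example.

```python
# Eightfold density growth per decade, and the footprint after a migration.

DENSITY_GROWTH_PER_DECADE = 8
annual_growth = DENSITY_GROWTH_PER_DECADE ** (1 / 10)   # ~1.23, i.e. ~23%/year

racks_today = 100                                       # arbitrary example
racks_after_migration = racks_today / DENSITY_GROWTH_PER_DECADE

print(f"implied annual density growth: {annual_growth - 1:.0%}")
print(f"{racks_today} racks today -> ~{racks_after_migration:.1f} racks after migration")
```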

While I wish Facebook luck, I don’t think optical disk is coming back into the mainstream. For data that is cold enough to go to a nearline solution, the cost and density advantages of tape outweigh any advantage I can see in Blu-ray.
