Things to Keep in Mind With Thin Provisioning

Thin provisioning of storage has evolved into one of the key features that new storage systems must offer

11:00 AM -- Thin provisioning of storage has evolved into one of the key features that new storage systems must offer. It allows you to create array volumes and assign them to applications at the capacity each application is projected to need. But, as I discussed in my article on Thin Provisioning Basics, actual capacity is only allocated when new data is written to the system. This can mean a dramatic savings in storage acquisition costs.

As is true with most technologies, there are some things to keep in mind when you deploy thin provisioning -- and they don't necessarily include the knee-jerk concern about over-provisioning yourself out of actual disk space. Between the reporting that is available and responsible administration practices, the capacity-emergency scenario rarely, if ever, rears its head. Instead, there are other things that storage administrators should think about -- including expectations for how thin provisioning works in conjunction with data lifecycle practices.

One such area is block space reclamation, or rather, the lack thereof. The idea behind block space reclamation is to return blocks of storage that are no longer occupied by data so that they can be used again (reclaimed) and assigned to other volumes.
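To make that concrete, here is a minimal sketch of the behavior -- a hypothetical model of a thin volume, not any vendor's implementation. Physical blocks are consumed only on first write, but there is deliberately no reclaim path: when the filesystem deletes data, the array never learns the blocks are free again.

```python
# Hypothetical model of a thin volume without block space reclamation.
# Units are simplified to 1-GB "blocks" for illustration.

class ThinVolume:
    def __init__(self, provisioned_gb):
        self.provisioned_gb = provisioned_gb   # the size the host sees
        self.allocated = set()                 # blocks actually backed by disk

    def write(self, block):
        # Physical capacity is consumed only on the first write to a block.
        if block >= self.provisioned_gb:
            raise ValueError("write past provisioned size")
        self.allocated.add(block)

    def delete(self, block):
        # A filesystem delete only updates filesystem metadata; without a
        # reclamation mechanism, the array keeps the block allocated.
        pass  # note: no self.allocated.discard(block) -- that's the point

    @property
    def allocated_gb(self):
        return len(self.allocated)

vol = ThinVolume(500)
for b in range(50):          # the application writes 50 GB
    vol.write(b)
print(vol.allocated_gb)      # 50 -- only written blocks consume capacity

for b in range(25):          # "free" 25 GB at the filesystem level
    vol.delete(b)
print(vol.allocated_gb)      # still 50 -- nothing was reclaimed
```

The `delete` method that does nothing is the whole story: the host and the array disagree about what's free, and the allocated capacity only ratchets upward.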

Let's say you create a 500-GB thin volume and things start off as predicted, using 10 percent of that capacity, or 50 GB. Good! That's what thin is for -- it saved you 450 GB or so right off the bat. Way to go.

Over several years, let's assume the volume grows and the used capacity approaches the expected 500 GB. As part of normal lifecycle processes, you archive 100 GB of data that is more than three years old to a disk archive or even tape. Good for you! But without block space reclamation, that 100 GB is not returned to the pool: the volume that was thin when it started is now the size it would have been if you had created a fat volume originally. Still, the capacity purchases you made over the years cost less per GB than if you had hard-provisioned all the capacity up front. And if the capacity only ever grows to 350 GB instead of 500 GB, you never spend money on the 150 GB of unnecessary capacity at all. Any way you slice it, it's an excellent lifecycle story.
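The arithmetic in that story can be written out in a few lines -- back-of-the-envelope numbers taken from the example above, not a sizing tool:

```python
# Lifecycle arithmetic for the 500-GB thin volume example above.

provisioned_gb = 500

# Scenario 1: usage grows to the full 500 GB, then 100 GB is archived off.
written_gb = 500
archived_gb = 100
live_data_gb = written_gb - archived_gb   # 400 GB the host still uses
allocated_gb = written_gb                 # stays 500 GB: no reclamation
print(live_data_gb, allocated_gb)         # 400 500

# Scenario 2: usage only ever reaches 350 GB.
peak_usage_gb = 350
never_bought_gb = provisioned_gb - peak_usage_gb
print(never_bought_gb)                    # 150 GB you never had to buy
```

Note the gap in scenario 1: the host holds 400 GB of live data, but the array still has 500 GB allocated, and that gap persists until something reclaims the freed blocks.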
