Capacity Planning: View From the Trenches

Storage managers rely on a range of tools -- including intuition -- to plan storage capacity

March 5, 2008

6 Min Read

It's difficult -- if not impossible -- to generalize about storage capacity planning. Talk to a storage customer, and it's evident there are at least as many approaches as there are storage setups. Talk to a vendor, and prepare to have the parameters of the conversation blown out to include the alpha-to-omega of data management -- including policy networking, tiered storage, and lifecycle management.

A few basic tenets apply: The old mainframe-era adage of maintaining 50 percent to 60 percent capacity thresholds remains the conventional wisdom with SANs and other networked storage. But some other technical and business realities have emerged to complicate things -- namely, the advent of virtualization, increased IOPS (input/output operations per second) demand, and the ability to tie storage consumption to business requirements or strategic initiatives.
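That mainframe-era rule of thumb reduces to a simple utilization check. The sketch below is illustrative only (the function name and sample numbers are invented for this example), using the 60 percent ceiling mentioned above:

```python
# Sketch of the conventional-wisdom threshold from the article: treat an
# array as "full" for planning purposes once utilization passes the
# 50-to-60 percent range, leaving headroom for growth and performance.

CAPACITY_CEILING = 0.60  # upper end of the mainframe-era rule of thumb

def needs_more_capacity(used_tb, raw_tb, ceiling=CAPACITY_CEILING):
    """Return True when utilization exceeds the planning ceiling."""
    return used_tb / raw_tb > ceiling

print(needs_more_capacity(used_tb=7.0, raw_tb=10.0))  # 70% used -> True
print(needs_more_capacity(used_tb=4.5, raw_tb=10.0))  # 45% used -> False
```

In practice the ceiling varies by shop; the point is that procurement is triggered well before the array is actually full.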

In an effort to discern some common best practices, Byte and Switch turned to those in the trenches for input. Following are comments from storage managers about their most efficient means of predicting storage requirements for IT budget planning.

Use multiple measurements

Excel spreadsheets are perfectly adequate for online media site Alibris when it comes to capacity planning, says CTO Michael Schaffer. It's what you put into the spreadsheet that counts. And Schaffer is interested in two major metrics: raw storage capacity and performance (IOPS and throughput).

"Our business drives growth in the demand for these services. We have a multiyear financial plan (in dollars), which we map to order volume and inventory stats, like the number of media vendors in our network and the number of SKUs in our catalog," Schaffer says. "The vendor count, SKU count, and order count all drive our expected storage needs, in Gbytes and IOPS."

There's one other variable in the capacity planning equation, too: time, or how long it takes to acquire and integrate additional capacity. "We try to have an extra 12 months of capacity," says Schaffer, noting that his 3PAR SAN makes it easy to add capacity quickly and painlessly.
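Schaffer's approach can be sketched as a simple model that maps the business-plan metrics he names (vendor count, SKU count, order count) to projected Gbytes and IOPS, then pads the result for the 12-month acquisition lead time. Every coefficient and function name below is hypothetical -- the article gives the inputs, not the formula:

```python
# Illustrative sketch of metric-driven capacity forecasting. The
# per-unit coefficients and growth rate are invented for this example;
# a real shop would fit them from its own historical data.

def forecast_storage(vendors, skus, orders_per_month,
                     gb_per_vendor=5.0, gb_per_sku=0.002,
                     iops_per_order=0.5,
                     monthly_growth=0.03, buffer_months=12):
    """Return (gbytes, iops) needed, grown forward by the lead-time buffer."""
    base_gb = vendors * gb_per_vendor + skus * gb_per_sku
    base_iops = orders_per_month * iops_per_order
    # Project the estimate forward across the 12-month acquisition window
    growth = (1 + monthly_growth) ** buffer_months
    return base_gb * growth, base_iops * growth

gb, iops = forecast_storage(vendors=10_000, skus=2_000_000,
                            orders_per_month=500_000)
print(f"Plan for ~{gb:,.0f} GB and ~{iops:,.0f} IOPS")
```

The useful property of a model like this is that a change in the financial plan (more vendors, more SKUs) translates directly into a storage line item for the budget meeting.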

Viewed through that prism, Alibris's requirements for raw bytes have remained relatively steady, while Schaffer finds that IOPS are being consumed more rapidly. He attributes this to inventory churn. "Our business is seasonal -- huge August, then slow autumn, good Christmas, huge January. But we've learned to model it after all these years," he explains.

Predict what's affordable

Excel's also the main tool for managing capacity at Indiana University's Kelley School of Business in Bloomington, Ind. But the bottom line is the ultimate gauge for Jared Beard, associate director of the B-school's IT labs.

"You anticipate capacity and predict it, but when it's all said and done, capacity is determined by the budget meetings and how much they can afford," says Beard. "We're fortunate in that our dean sees IT as a priority, and if a new service is seen as a priority, the money will typically be found for it."

Beard's approach to increasing capacity starts with replacing basic SAN increments as quickly as possible, with a view to less obvious storage requirements as well as blatant ones. "You can't just add 2 Tbytes on the shelf and not have 2 Tbytes more to back it up. Anticipating all that is really tough," he says.

Beard also tries to anticipate storage needs based on past patterns. "Every time we add a new service for our customer base -- faculty and staff -- they take it to an extreme, and we have to triple our capacity to handle it."

Beard is currently swapping out 300-Gbyte shelves in his HP SAN for 750-Gbyte units, as fast as budgets allow. The new storage is needed for all kinds of video applications, which are consuming capacity almost as quickly as it can be deployed.

Beard's also got an eye on reclaimed capacity from servers that have been consolidated under VMware, as the school makes wider use of virtualized servers and virtual machines.

So far, the combination is working, but a lot of flexibility is still required. "There's no hard and fast rule," Beard says about how much overhead to keep.

Use quota power -- and trust the gut

Steve Damadeo, IT operations manager for Festo Corp., a Hauppauge, N.Y.-based manufacturer, believes in the power of storage quotas and thresholds for individual users. That approach has helped keep the company's storage growth under 20 percent for the last several years.

"We set a quota for each individual, and they get 'X' amount of space -- when it gets to about 90 percent, they get an alert, and we'll push it up till it gets to about 2 Gbytes per person," Damadeo explains. If they hit 90 percent usage again, the user gets a request from IT to go in and see what can be deleted. "Many people inadvertently store Webcasts or media files -- not musical collections, though there are a few of those, but product videos, like a 300-Mbyte file that's never been opened," he laughs. "But they think they might need it."
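The quota-and-alert scheme Damadeo describes can be sketched in a few lines. This is not Festo's actual tooling -- the function and sample user names are invented -- but it shows the 2-Gbyte quota and 90 percent alert threshold from the article:

```python
# Illustrative per-user quota check: flag anyone who has crossed the
# 90 percent alert threshold on a ~2-Gbyte quota, per the article.

QUOTA_BYTES = 2 * 1024**3   # ~2 Gbytes per person
ALERT_THRESHOLD = 0.90      # alert when usage hits 90 percent of quota

def users_to_alert(usage_bytes_by_user):
    """Return a sorted list of users past the alert threshold."""
    return sorted(
        user for user, used in usage_bytes_by_user.items()
        if used / QUOTA_BYTES >= ALERT_THRESHOLD
    )

usage = {"alice": int(1.9 * 1024**3), "bob": int(0.3 * 1024**3)}
print(users_to_alert(usage))  # alice is past 90 percent; bob is not
```

A real deployment would hang a check like this off a scheduled scan of file-server usage and feed the list into an email alert, which is the workflow Damadeo describes.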

Damadeo also uses a maintenance and analysis tool from NTP Software to help assess Festo's storage needs. "We use it to track trends," he says, but an internal policy on growth has proven an effective brake on projects or users that threaten to gobble up disk space.

"We operate with a 20 percent [growth] rule, so we're able to slow the growth with our [2-Gbyte] quotas and how we handle them," Damadeo says. "It's not that we don't want people to store files, but 90 percent of the files stored are unused for over a year, or are infrequently accessed." (Does that complaint sound familiar?)

Damadeo also doesn't allow voracious users to dismiss their consumption with another common refrain: "Oh, storage is so cheap." Apart from the cost to acquire capacity, he reminds them of the soft costs associated with storage, such as the additional time spent adding capacity and doing backup, which can be a huge time-suck.

Festo also uses system reports of available disk space for its primary file systems, which total about 9 or 10 Tbytes. But Damadeo reports he's still working with essentially the same SAN he bought in 2004. "Am I looking at the next step? Yes. Do I need to do it this year? No."

Damadeo believes in using a combination of metrics and gut feel to handle capacity planning, and, so far, that system has worked well. "You've got to be two years ahead of your business; otherwise, where you are and where the business is going can converge, and IT can't provide the necessary capacity," he says.


  • NTP Software Inc.

  • 3PAR Inc.

  • VMware Inc.
