Calculating a Use Case for SSDs

There is a process to follow to ensure that an investment in solid-state disks is useful and will pay for itself quickly

Solid-state disks (SSDs) are an investment that requires serious forethought and a deeper understanding of your storage architecture. How does an IT manager know when to invest in SSDs, and how can they feel confident that the investment is going to pay off? These questions are becoming more relevant as the cost of SSDs has continued to decline over the past few years, a decline that has increased the number of application workloads well suited to SSDs.

Investing in SSDs is not the roll of the dice that many believe it to be. There is a process to follow to ensure that you invest in SSDs at the right time to deliver maximum benefit to your organization, and to know before the product is implemented that it will pay for itself quickly.

The first step is to gather statistics about your environment. You need information not only on your storage I/O but also on the application servers. For most environments, the tools to perform the analysis are free and readily available: most UNIX and Linux environments can use iostat, and Windows environments can use Perfmon.
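As a minimal sketch of working with that raw data, the snippet below parses one device line of `iostat -x` output into named fields. The sample header and values are illustrative; column names vary across sysstat versions, which is why the sketch maps columns from the header row rather than hard-coding positions.

```python
# Sketch: turn one device line from `iostat -x` output into a dict keyed
# by column name, so later analysis can refer to fields like "%util".
# The header/sample strings below are made-up illustrative data.

def parse_iostat_line(header, line):
    """Pair each column name from the iostat header with its value."""
    names = header.split()[1:]          # skip the leading 'Device' label
    parts = line.split()
    device = parts[0]
    values = [float(v) for v in parts[1:]]
    return device, dict(zip(names, values))

header = "Device  r/s   w/s   rkB/s  wkB/s   aqu-sz  %util"
sample = "sda     12.0  48.0  256.0  1024.0  2.40    61.5"
dev, stats = parse_iostat_line(header, sample)
print(dev, stats["aqu-sz"], stats["%util"])   # sda 2.4 61.5
```

In practice you would sample over a representative business period (for example, `iostat -x 5` during peak hours) and feed each line through a parser like this before averaging.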

A good next step is to look at CPU utilization. As a rule of thumb, if your average CPU utilization sits at 33 percent or below, you more than likely have a bottleneck somewhere else in the application stack: the server CPUs are waiting on something else -- and that something else is very often storage.
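The rule of thumb above can be sketched as a simple check over sampled utilization figures. The 33 percent threshold comes from the article; the sample values are illustrative.

```python
# Sketch of the 33 percent rule of thumb: a server whose average CPU
# utilization stays at or below roughly a third is likely waiting on
# something else in the stack -- very often storage.

def likely_io_bound(cpu_samples, threshold=33.0):
    """Return (suspect_storage, average_utilization) for a list of
    sampled CPU utilization percentages."""
    avg = sum(cpu_samples) / len(cpu_samples)
    return avg <= threshold, avg

suspect, avg = likely_io_bound([22.0, 30.0, 18.0, 25.0])
print(f"avg CPU {avg:.1f}% -> storage suspect: {suspect}")
```

A low average alone is not proof, which is why the article goes on to confirm the diagnosis with queue-depth data.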

To identify whether storage I/O performance is truly the issue, the next step is to find a task on the application server that is continuously working the disk. For this task, we want to examine disk queuing, or hard-disk queue depth. Queue depth is the number of commands a device is holding in its command queue. When deciding whether SSD is right for you, the key concern is not filling up the queue and locking up the server; rather, the goal is to ensure that the storage array handles the queue depth efficiently. Perfmon shows queue depth as a stand-alone counter (Avg. Disk Queue Length); iostat -x reports the average queue size in its avgqu-sz column (aqu-sz in newer sysstat releases) alongside %util, the percentage of time the device was busy.
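One way to reason about those queue-depth samples is sketched below: average the sampled depth and normalize it by the number of devices behind the volume. The per-device ratio is my own illustrative heuristic, not a figure from the article, and the sample depths are made up.

```python
# Sketch: average sampled disk queue depth (Perfmon's "Avg. Disk Queue
# Length" or iostat's avgqu-sz) and express it per device behind the
# volume. A queue that stays much deeper than the device count suggests
# the array cannot drain requests fast enough -- the kind of workload
# where SSDs tend to pay off. The per-device heuristic is an assumption.

def queue_pressure(queue_samples, device_count):
    """Return (depth_per_device, average_depth) for sampled queue depths."""
    avg_depth = sum(queue_samples) / len(queue_samples)
    return avg_depth / device_count, avg_depth

ratio, depth = queue_pressure([8.0, 12.0, 10.0, 14.0], device_count=4)
print(f"avg queue depth {depth:.1f}, {ratio:.2f} per device")
```

As with the CPU check, a single snapshot proves little; sustained pressure across a representative sampling window is what makes the case for SSDs.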
