Time - The First Casualty Of Lack Of Storage Analytics

George Crump | Commentary | 08:47 AM

In IT, time is no longer on your side. Staff are stretched too thin, new products and capabilities are coming at you too fast, and requests for additional storage performance or capacity never stop. The problem is that there is only a limited amount of time available to examine what is causing a performance bottleneck or how much capacity an application really needs. The current trends of virtualization, tiered storage and infrastructure consolidation make diagnosis even more challenging.

Often the first solution your storage vendors will offer when you present them with a performance or capacity problem is to throw more hardware at it. That additional hardware or software then costs you more time: the obvious time spent implementing the product, and the less obvious time spent managing yet another storage area or software task.

Often these so-called solutions just slap a bandage on the problem, and all these layers of fixes make your management burden heavier. In fairness, there are times when a quick fix is all you can legitimately afford or have time to apply, and that's okay as long as you know that's what it is. There are also times when applying new hardware or software genuinely solves and eliminates a performance problem.

Knowing the difference, and knowing what to apply, is the hard part. Having the time to understand the nature of a problem, or even to predict it before it occurs, especially in the dynamic environment the data center has become, is a luxury most organizations do not have. We have moved well beyond the days when a spreadsheet was a useful aid for managing the storage environment. What is needed is real-time or near-real-time analytics.

This is where analytical tools like those offered by Virtual Instruments, Vizioncore, Dynamic Ops and Tek-Tools can help. These tools can monitor and diagnose problems in your environment and help you track down the root cause. Many times, performance problems are caused by improper configurations that relatively simple tuning can solve. They can also examine the performance characteristics of virtual servers, physical servers and tiers of storage to make sure you have the right applications and data on the right platform.
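To make the idea concrete, the core of such monitoring is often simple threshold analysis over periodically collected metrics. The sketch below is a minimal, hypothetical illustration (the `VolumeSample` structure, thresholds and alert format are assumptions, not any vendor's actual API): it flags volumes whose capacity utilization or I/O latency crosses a configured limit, the kind of check a spreadsheet cannot run continuously but an analytics tool can.

```python
from dataclasses import dataclass

@dataclass
class VolumeSample:
    """One point-in-time measurement for a storage volume (hypothetical schema)."""
    name: str
    used_gb: float
    total_gb: float
    latency_ms: float

# Example thresholds; real tools would make these configurable per tier.
CAPACITY_PCT = 80.0   # alert when a volume is this full
LATENCY_MS = 20.0     # alert when average I/O latency exceeds this

def analyze(samples):
    """Return a list of human-readable alerts for samples that breach a threshold."""
    alerts = []
    for s in samples:
        used_pct = 100.0 * s.used_gb / s.total_gb
        if used_pct >= CAPACITY_PCT:
            alerts.append(f"{s.name}: capacity {used_pct:.0f}% (>= {CAPACITY_PCT:.0f}%)")
        if s.latency_ms >= LATENCY_MS:
            alerts.append(f"{s.name}: latency {s.latency_ms:.1f} ms (>= {LATENCY_MS:.1f} ms)")
    return alerts
```

Run on each collection interval, a loop like this turns raw metrics into actionable signals; commercial tools add trend analysis on top, so a capacity problem can be predicted before the threshold is ever hit.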

George Crump is president and founder of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. With 25 years of experience designing storage solutions for datacenters across the US, he has seen the birth of such technologies as RAID, NAS, ...