It was a huge deal, both financially and operationally: the data center floor was swept of one vendor's equipment and replaced by the new, cool stuff that I was representing. Delivery necessitated the closure of a London street on a Saturday morning so a crane could lift the equipment to its appointed floor. And the punch line? The total capacity was 10 GB. It is amazing when you consider that today I can buy a flash memory card about the size of a fingernail that holds 64 GB!
My point is not just about the world moving on in absolute capacity; there's a relative story here, too. After all, at the time, that 10 GB represented "big data". Those of you who were around then will recall a serious debate within the industry about how many gigabytes a single capacity planner could realistically manage. Maybe two, perhaps four? But, to repeat, the world has moved on. Or has it?
Today, there's lots of talk about big data. The term is typically paired with a word such as "analytics," and its precise meaning remains rather mysterious--but it definitely refers to processing and storage needs that exceed the abilities of standard platforms. The category exists partly because--at least according to conventional wisdom--many of the issues associated with "regular" large-scale storage management have apparently been solved by modern storage software: thin provisioning, automation, what-have-you.
Of course, not everyone has the skills, budget, or scale to deploy such functionality. But one might reasonably assume that "regular big data"--just the large amounts, without the analytics--is no longer much of an issue in the largest IT enterprises.
Our research shows that, surprisingly, this is not the case. Instead, it looks as if data growth is still outpacing the ability of advanced storage to manage and tame it. Consider: organizations with larger storage capacities not only still rank "managing data growth" (in a tie with "improving backup and recovery") as their top overall IT priority, but they are also significantly more likely than smaller operations to be hiring storage administrators. For comparison, in smaller IT operations--those with 50 TB and under--"managing data growth" ranks only sixth on the IT priorities list. Does this mean that all those advanced storage features designed to optimize the management and protection of data are not living up to their billing? Or does it perhaps mean that these larger organizations simply aren't using them effectively, or at all?
Every year, ESG conducts an IT Spending Intentions survey. As one might expect, the top IT priorities for 2012 include--as they have for three straight years--increased use of server virtualization, improved backup and recovery, major application deployments or upgrades, managing data growth, and information security. For 2012, however, "managing data growth" rose to the top overall spot among organizations with at least 500 TB of storage capacity under management. These same organizations also place greater emphasis on improving backup and recovery processes. Together, these two findings suggest that, despite a plethora of advanced storage and data protection features and tools, organizations with significantly larger storage capacities are still struggling to keep everything in check.