Stoking The Storage Machine

The storage requirements of many businesses are growing like wildfire. And cities aren't exempt from this growth, either. Wichita, Kan., for example, is looking at a conversion of 6 terabytes of storage capacity from direct-attached storage to a storage area network.

December 8, 2003

Network Computing

Wichita, Kan., used to manage only static data, mainly text. That is, "until we implemented document imaging and digital cameras," says Kevin Norman, the city's IT operations manager. Those two new technologies alone have caused a steady increase in the demand for storage capacity. "They drove our storage requirements up to levels we hadn't seen before," Norman says.

Not only that, business as usual has caused its own headaches. Every government worker stores a little more data than he or she used to, and outside parties make more storage demands, too. For instance, vendors send in drawings for project consideration online.

In response, Norman next year will oversee a conversion of 6 terabytes of storage capacity from a Hewlett-Packard direct-attached-storage framework that uses 80 servers to a storage area network, for increased performance and more-efficient management. Norman hasn't yet decided which vendor will provide the SAN, he says.

After at least three years of spending cuts and spiraling markets, real increases in storage capacity are a major industry trend for 2004, according to users and analysts. Another trend, storage-management automation --for some, the Holy Grail of storage technology--will become real and begin to remove some of the storage burden from customers. A third trend, regulatory compliance-driven storage demands, will join with the desire for more-efficient storage media to create a real addition to the storage infrastructure, known as information life-cycle management. Finally, a protocol for running blocks of data across Ethernet networks will be relegated to low-end requirements.

Customers have spent the last three years burning up the storage capacity they created for year 2000 contingencies, says John Webster, an analyst at market research firm Data Mobility Group. "People in 2004 should be purchasing new storage for the first time in a while," he predicts.

"I could see into 2004 having to double the space we now have available," says Tim Link, CIO at Ohio State University at Newark and Central Ohio Technical College. Link is anticipating storage-capacity problems as he oversees new capabilities for students and faculty, including video for digital editing and the Helix project, which integrates curriculum design with video on an interactive Web site for faculty. "It's just going to take off as the faculty realize what they can use," he says.

Jeremy Burton, chief marketing officer at Veritas Software Corp., a business-technology-management software vendor, doubts that most companies will have double the money next year to spend on storage. But many are at least consolidating the number of storage devices they maintain in favor of newer, more-efficient systems. And IT dollars are being allocated for the process. "I hear that IT budgets are more predictable than they used to be," Burton says.

Kent Kilpatrick, IT manager at Fiserv Credit Processing Services, a business-to-business credit-management company, maintains 5.2 terabytes of data on an IBM Enterprise Storage Server and expects to have more than 8 terabytes of storage capacity in use by next year. "The [bad] economy can be a double-edged sword for us," he says. "Sales go down, but people put more purchases on credit cards." Kilpatrick hopes to get to 10 terabytes of storage within the same IBM box by adding an expansion frame.

Storage area networks are groups of servers connected with pools of storage devices via Fibre Channel. CarePlus Medical Centers LLC maintains 11 health-care centers across Dade County, Fla., serving 300,000 patients. Bill Bounds, director of IT at CarePlus, has the approval to deploy an IBM-based SAN next year with about 7 terabytes of capacity. "We also want a redundant data center for failover, and I expect electronic records by midyear," Bounds says.

Electronic records and imaging will be very important to the health-care industry in the coming years, and Hackensack University Medical Center expects to increase capacity from 28 terabytes this year to 62 terabytes next year, supporting new efforts ranging from radiology to scanned images for employee IDs. The medical center hopes to leverage the Hewlett-Packard Enterprise Virtual Array storage system it has in place. "Instead of islands of storage, we'll be storing data in a central place," says Ed Martinez, the medical center's IT director. "The big advantage is centralized management, rapid information recovery, growth on demand, and adjusting to new types of data."

Jay Kidd, VP of marketing at Brocade Communications Systems Inc., says falling storage-network component prices--including Brocade's--should attract customers who haven't bought before, particularly in small and midsize businesses. "When the grueling economic winter began, SAN was still new," Kidd says. Next year, small and midsize business customers "can look at success at larger companies and match that with falling prices."

In the case of Union Bank of California, the business was using only 13 terabytes of 27 terabytes of capacity. Next year, Rick Curry, VP of enterprise server support at the bank, hopes to operate storage as a generic resource. "We plan to take back ownership of the physical resource and turn it into a utility service," Curry says.

Union Bank stores everything on high-end Symmetrix storage devices from EMC Corp. and uses Brocade switches to keep storage separate from the servers. And Curry hopes someday to run some applications at the network level with Brocade. "In 2004, we hope to eliminate redundancy in software licensing, dive into midtier storage, and come up with charge-back processes for the business units," he says.

Big jumps in capacity would be a nightmare without management tools to help, Webster says, but 2004 should also be the year when these tools come of age. "Management is coming together now, and small vendors--like AppIQ and CreekPath--are helping to automate the storage network," he says. "We should also experience the acquisition of the little guys by big guys."

Virtualization--combining multiple storage arrays into a layer providing a single view--is what customers should look for, says Jeff Barnett, manager of market strategy at IBM storage software. Virtualization will mean intelligence to pool together storage resources and find underutilized capacity on existing devices, Barnett says. "If the resources are broken up into 10 new arrays, that's not useful. With virtualization, it could all be used as one continuous space."
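Barnett's point can be illustrated with a minimal sketch of what a virtualization layer does at the block level: several physical arrays of different sizes are concatenated into one logical address space, so spare capacity on any device becomes usable. The class, array sizes, and mapping scheme below are hypothetical simplifications, not any vendor's actual implementation.

```python
class VirtualPool:
    """Toy model of block-level storage virtualization: several arrays
    appear as one continuous logical address space."""

    def __init__(self, array_sizes):
        # array_sizes: capacity in blocks of each physical array
        self.arrays = array_sizes

    def total_capacity(self):
        return sum(self.arrays)

    def locate(self, logical_block):
        """Map a logical block number to (array index, physical block)."""
        if not 0 <= logical_block < self.total_capacity():
            raise IndexError("logical block out of range")
        for i, size in enumerate(self.arrays):
            if logical_block < size:
                return i, logical_block
            logical_block -= size


pool = VirtualPool([100, 250, 50])   # three mismatched arrays pooled as one
print(pool.total_capacity())          # 400
print(pool.locate(120))               # (1, 20): block 20 on the second array
```

A real virtualization layer would also handle striping, migration, and failure, but the core idea is the same: the host sees one continuous space, and the layer decides which physical device actually holds each block.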

Still, some storage users are sticking with the systems they have. Terry Gaasterland, head of the laboratory of computational genomics at Rockefeller University, a biomedical research institute, is working with more genomes, creating larger amounts of data. "We already grew from 300 Gbytes to 3 terabytes, and I expect us to reach 10 to 20 terabytes over the next two to five years," Gaasterland says. The lab will count on the existing Linux-based network-attached-storage technology it bought from startup vendor Panasas Inc. earlier this year. "We'd be spending twice to 10 times as much on storage if it wasn't for Panasas," Gaasterland says.

And Panasas hopes to make management easier and more precise in 2004. "Next year, we will expose more capabilities around managing," says Paul Gottsegen, VP of marketing. "We will offer policy-based management per file, recognizing attributes for every file."

Another vendor expects a big market for the new breed of storage required for blade and Linux servers. David Scott, founder and CEO of 3PAR Data Inc., says utility storage, built from industry-standard components, masks complexity and scales easily in response to business needs. "Customers have had two or three years sweating assets, but they're coming out of the phase when they've looked at cutting costs," Scott says. "We plan to at least double performance in response."

Customers across markets could best counter capacity complexity by signing on to a utility approach for easier management, says Bob Schultz, general manager of storage at HP. "Customers in '04 will still be getting a handle on their costs for IT," he says. "SAN customers will look at constant growth, realize that disks just get cheaper, and wonder why they should pay up front."

One 3PAR customer went with the vendor primarily because the utility model lowered costs and improved efficiency. Next year, Adam Unger, senior principal of business technology at American Management Systems, a systems integrator, wants to improve disaster recovery but doesn't want to introduce complexity with an additional private network. "We'll look for remote mirroring on native IP and expect some mainstream mirroring app from 3PAR inside the box," Unger says.

Executives at The First Years Inc., a developer of products aimed at better parenting, were tired of the costs associated with backing up data. A loader was needed for each server, and a person had to rotate the tapes and send them off-site. "We used a tape drive for every backup," says Ron Cardone, First Years' senior VP of IS. The company deals with large graphic files, and that's what led to the use of so many tape drives. "We were spending $7,000 per drive," Cardone says.

While First Years was deciding what to do, Cardone got a call from ExaGrid Systems. ExaGrid buries intelligence into network chassis that operate over the existing IP network with industry-standard third-party software products. Each chassis acts as the node of a cluster, so customers get high-availability benefits. ExaGrid adds automatic data migration, file-corruption protection, and a self-healing architecture.

As part of a beta program, Cardone oversees a storage cluster across two buildings. "We back up on a rack, and software allows us to schedule when we want to back up data and archive it," he says. Cardone will be happy to eventually do away with tape after he rolls out ExaGrid next year. "Large tape libraries are expensive, work for a year or two, and then they fail often," he says. "We should be much more cost-effective without them."

Another storage user is looking forward to passing some of the management responsibilities on to other people. Robert Young, WAN administrator at MTS Systems Corp., a mechanical testing company, says the company's backup and recovery function "was given to me because the person who had it left, and I was the only one dumb enough to take it."

Young counts on software from Altiris Inc. to automatically back up employee systems; in case of an outage, it lets users roll data back to a specific point in time. Eventually, backup will become a help-desk function: All of MTS's 1,000 employees are supposed to be covered by the Altiris software next year. "The biggest roadblock to that is our limited staffing in the face of rapid company growth," Young says. "We'll roll out Altiris in '04, right after we roll out Microsoft Active Directory."

Mark Magee, a segment manager for clients at Altiris, promises an easier-to-use tool next year. "We plan to cut two steps down to one for recovery," he says. "We'll integrate our Recovery Solution with [our] Software Delivery and PC Transplant" products.

A tight overall budget for the city of Tacoma and Pierce County in Washington means fewer police officers on the street, and they'll need the utmost in information capabilities to do their jobs right. Mark Knutson, assistant director of IT at the city's Law Enforcement Support Agency, expects to manage an explosion of data from digital photos, electronic attachments, and scanned documents. "We'll count on appliances from BlueArc Corp. because of their ability to move management software into the network fabric, wiping out management complexity," Knutson says.

Geoff Barrall, chief technology officer and founder of BlueArc, says too many customers still use the same tools for tens of terabytes of data they used when they had 60 Gbytes of capacity. "Management of data is key to all companies, and we have silicon servers for replication and storage," he says. "We'll make sure our file systems take as much out of the management process, including allocation and movement, as possible."

Management remains at the center of the third storage trend for 2004, but it encompasses even more. The regulatory environment and the threat of litigation make companies' data-management policies even more important, says Wayne Rickard, chairman of the Storage Networking Industry Association technical council. Regulations have people thinking about additional tiers of storage and automation to get data in and out in the most efficient and effective means possible, says Data Mobility analyst Webster.

If compliance is the primary driver of the desire for improved data accessibility, "the second driver is the concept of classifying and managing data by its type," Rickard says. "Some call it information life-cycle management, but it's just focusing on data rather than storage components."

Webster says information life-cycle management is "still in hype stage," but that doesn't mean the concept doesn't have resonance for users looking for a storage-management strategy. Information life-cycle management "is a push item for vendors, but top management among users are saying we need to get a handle" on growing storage needs, he says.

Norm Fjeldheim, CIO and senior VP at Qualcomm Inc., says he's actively pursuing information life-cycle management, rather than waiting for vendors to advance it. He'll work with all his storage partners but is leaning toward Hitachi Data Systems for hardware and Veritas for software. "We want to differentiate the data for how many times and what media we back up to," Fjeldheim says. "We have automated systems for backup but just want to do it better and more efficiently." Qualcomm isn't waiting around for outside help, either. "We've been trying some of this for years, using some of our own tools," Fjeldheim says. "Managing all the data we have now, we had to gain control so we don't continually throw money at disks."
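Differentiating data "for how many times and what media we back up to," as Fjeldheim puts it, boils down to a policy table: each class of data gets its own copy count, target media, and backup frequency. The sketch below is purely illustrative; the class names, media tiers, and numbers are hypothetical, not Qualcomm's actual policies.

```python
# Illustrative ILM-style policy table: backup treatment varies by data class.
# All class names, copy counts, and frequencies here are made-up examples.
POLICIES = {
    "transactional": {"copies": 3, "media": "Fibre Channel disk", "frequency_hours": 4},
    "email":         {"copies": 2, "media": "serial ATA disk",    "frequency_hours": 24},
    "archive":       {"copies": 1, "media": "tape",               "frequency_hours": 168},
}

def backup_plan(data_class):
    """Return the backup policy for a class of data, defaulting to archive."""
    return POLICIES.get(data_class, POLICIES["archive"])

plan = backup_plan("email")
print(plan["media"], plan["copies"])  # serial ATA disk 2
```

The payoff of classifying data this way is that only the classes that need frequent, multi-copy protection consume the expensive tiers; everything else drops to cheaper media automatically.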

Hitachi Data Systems plans to unveil software next year to help customers with indexing--for example, to retrieve all E-mail messages between two people for the previous seven years. "Some of the largest customers will deploy tiers of storage," predicts Claus Mikkelsen, senior director of storage apps at Hitachi.

Storage heavyweight and Hitachi competitor EMC plans to be a leader in information life-cycle management next year, says CTO Mark Lewis. "Customers don't come up and say they need ILM," he says. "But management says, 'Help us control costs, manage the information, and deal with compliance rules.'" The growth of data makes it very hard for customers to sort out retention, reuse, or archiving for compliance, Lewis says. In light of its acquisition of Documentum Inc., EMC will work hard next year toward an application-centric view of information life-cycle management, in addition to finding the lowest-cost form of media for every stage of the data life cycle, from creation to deletion. "As we look at the pure management of the element, the people cost will continue to be very high compared to the cost of the technology," Lewis says. "So we'll work to automate the environment."

HP, IBM, StorageTek, and Veritas will join EMC and Hitachi with information life-cycle-management products and strategies next year, says Jack Scott, analyst and founder of the Evaluator Group. "ILM will be the trend, and a piece of it is compliance," he says. "The other piece will be matching storage to the value of the data."

Jim Doedtman, technical planning manager at OSF Healthcare Corp., is dealing with exponential data growth in the health-care industry and fighting a battle over how much his company pays for storage. Doedtman is ready for enterprise data classification and hopes to discover what each business process--for example, a document-imaging system with dedicated cache--costs in storage. He's confident he'll have a choice of vendors; he's focused on a wide range of EMC storage systems separated by 75% in costs. "It caused us to take a hard look at ILM to have data on the right platform at the right time," Doedtman says. He says he'll start by placing some fixed-content data on storage that costs about 4 cents a megabyte, then move the other 80% of data to a penny-a-megabyte system.
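The economics behind Doedtman's split are simple to work out. Using his figures--roughly 20% of data on a 4-cent-per-megabyte tier and 80% on a penny-a-megabyte tier--the arithmetic below compares tiered cost against keeping everything on the high-end tier. The total data volume is a made-up figure for illustration; only the per-megabyte rates come from the article.

```python
# Back-of-the-envelope cost comparison for two-tier storage, using the
# per-Mbyte rates Doedtman cites. The 5-Tbyte volume is hypothetical.
total_mb = 5_000_000          # 5 Tbytes expressed in Mbytes (assumed volume)
high_tier_rate = 0.04         # dollars per Mbyte, high-end tier
low_tier_rate = 0.01          # dollars per Mbyte, penny-a-megabyte tier

all_high = total_mb * high_tier_rate
tiered = 0.20 * total_mb * high_tier_rate + 0.80 * total_mb * low_tier_rate

print(f"all high-end: ${all_high:,.0f}")
print(f"tiered:       ${tiered:,.0f}")
print(f"savings:      {1 - tiered / all_high:.0%}")
```

At those rates, tiering cuts the storage bill by 60% regardless of the total volume, since the savings ratio depends only on the split and the rates--which is exactly why matching data to the cheapest adequate platform is the heart of the ILM pitch.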

All companies will want the right data in the right place at the right time, says Jon Benson, VP and product line manager for tape automation at StorageTek, whether it's on high-end Fibre Channel disk drives, serial ATA disk drives, or tape media--but they shouldn't lose track of what remains paramount. "Consolidation and standardization will continue to be big in '04," he says. "While customers turn multiple data centers into one, and they peel the onion on consolidation, 24-by-7 operations are what matters, down to the last copy in the ILM architecture."
