Converged Network Reins in Runaway Cabling

Expansion is evident everywhere in the University of Alabama at Birmingham's data center. The number of servers, switches, and storage systems has grown as the institution has delivered more services to its users. As a result, cabling was strewn throughout the data center, and the need to simplify that wiring was a central reason the university decided to install an integrated computing solution.

The University of Alabama at Birmingham (UAB) has 30,000 undergraduate, graduate, and professional students, as well as 18,000 faculty and staff members. Its campus spans 80 city blocks and includes a biomedical research center and a hospital. Two-thirds of the students come from within the state, and the institution contributes $1.8 billion to Alabama's economy. To support its mission, UAB operates a data center that relies on Dell Linux servers as well as Oracle Solaris systems, which run Oracle databases and critical applications such as the institution's Student Information System. The university has about 200 TBytes of Tier 1 and Tier 2 storage, supported by Brocade, DataDirect Networks, and Hitachi Data Systems storage products.

In the winter of 2009, the university learned that its Brocade Fibre Channel storage systems (older McData products) were about to be phased out. Rather than continue with a Fibre Channel approach, the university decided to redesign its storage area network (SAN). "We wanted to move to Ethernet, so our data center infrastructure would be more cohesive," notes Bob Cloud, executive director of IT infrastructure services at UAB.

The previous approach had a number of inefficiencies. Running more than one type of network infrastructure meant that technicians had to be trained on different kinds of network equipment. In addition, there was little, and often no, integration among the university's management tools, so technicians had to bounce from screen to screen to identify the cause of connection problems. Running the proper cabling had also become a significant burden: the traditional approach to data center expansion meant buying additional servers, switches, and storage systems whenever processing requirements grew, and because so many devices had to be connected, the university was spending millions of dollars on cabling each year.

The new integrated systems connect autonomous devices inside the chassis rather than requiring cables to be run to every system. "We thought moving to the new infrastructure would cut our cabling costs in half," notes Cloud. The potential savings mattered because the cost of the new integrated devices often runs to seven figures.
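To make the cabling math concrete, here is a minimal back-of-the-envelope sketch of where such savings can come from. It compares cable counts for a traditional rack, in which each server carries separate Ethernet (LAN) and Fibre Channel (SAN) links, against a converged rack in which a single redundant pair of unified-fabric links per server carries both kinds of traffic. Every figure in it (rack density, link counts, cable cost) is an illustrative assumption, not a number reported by the university.

```python
# Back-of-the-envelope comparison of cabling needs: traditional
# (separate LAN + SAN links per server) vs. converged (unified
# fabric links per server). All figures below are illustrative
# assumptions, not numbers from the University of Alabama at Birmingham.

def cables_per_rack(servers, ethernet_links, fc_links, converged_links):
    """Return (traditional, converged) cable counts for one rack."""
    traditional = servers * (ethernet_links + fc_links)
    converged = servers * converged_links
    return traditional, converged

SERVERS_PER_RACK = 16      # assumed rack density
ETHERNET_PER_SERVER = 4    # assumed redundant LAN links per server
FC_PER_SERVER = 2          # assumed redundant SAN links per server
CONVERGED_PER_SERVER = 2   # assumed redundant unified-fabric links
COST_PER_CABLE = 150.0     # assumed installed cost per cable run (USD)

traditional, converged = cables_per_rack(
    SERVERS_PER_RACK, ETHERNET_PER_SERVER, FC_PER_SERVER, CONVERGED_PER_SERVER
)

savings = (traditional - converged) * COST_PER_CABLE
print(f"Traditional rack: {traditional} cables")   # 96 cables
print(f"Converged rack:   {converged} cables")     # 32 cables
print(f"Per-rack savings: ${savings:,.0f}")        # $9,600
```

Under these assumptions, a converged rack needs roughly a third as many cable runs; multiplied across many racks and yearly expansion, savings of that order are consistent with Cloud's expectation of cutting cabling costs in half.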
