Data centers


Putting the SAN in San Diego

We take a peek at SoCal's storage testing ground

If you think your firm's data growth is overwhelming, spare a thought for the San Diego Supercomputer Center (SDSC), which operates one of the largest academic SANs to support the TeraGrid research network, comprising organizations such as Oak Ridge National Laboratory and the Pittsburgh Supercomputing Center.

“You may have the network go offline and that’s annoying, but if we lose someone’s data, they come to your door with torches and a rope,” says Bryan Banister, SDSC storage manager, explaining that he has to store data for around 100 active TeraGrid projects.

The exec recently spoke to Byte and Switch about the challenges involved in supporting a 28-Pbyte SAN, describing his plans for 8-Gbit/s Fibre Channel, SSDs, storage virtualization, and state-of-the-art file systems.

“One of the major projects that we have been working on over the last couple of years has been deploying a really large wide area network (WAN) file system for the TeraGrid project,” he says. “Using the IBM GPFS file system technology we’re able to provide over 600 Tbytes worth of file system space for scientific projects to store their data results and other important scientific data collections.”

The exec explained that SDSC is looking to expand the WAN file system this year with Sun’s Lustre parallel file system, highlighting just some of the cutting-edge work currently underway in Southern California.
