Panasas Rocks Stanford

Computing center replaces NFS with 'neo-NAS' system that works with open-source clustering

August 27, 2005

The Institute for Computational and Mathematical Engineering (ICME) at Stanford University claims it's found high-performance storage muscle without a SAN, thanks to a cluster-friendly, network-attached system from Panasas Inc.

For the past six months, ICME has used the vendor's ActiveStorage system to power high-performance computing (HPC) clusters built on the open-source Rocks software.

"We needed a solution that was easily integratable into our Rocks cluster," says Steve Jones, technology operations manager at ICME. Rocks is an open-source software suite that generates a common database and management system for Linux clusters. It was developed in part by the San Diego Supercomputer Center and is available online.

Early this year, things hit a wall for the ICME cluster, which comprises 164 processors running on 82 computers on campus. The cluster is used to conduct sponsored research for a range of government agencies, including the Defense Advanced Research Projects Agency (DARPA).

Panasas came into the picture when ICME wanted a cluster-oriented storage system that wouldn't involve a SAN. "We try to purchase cheap solutions," Jones acknowledges. The group needed better storage, but felt that, per terabyte, the cost of a high-performance SAN, including training time, didn't make economic sense.

But something had to be done. The group was finding it tougher to handle simulations with hundreds of thousands of possible iterations using two RAID-adapted JBOD systems and NFS file serving. "It was easy to saturate," Jones recalls. While splitting users across the two disk systems worked for a while, it wasn't long before ICME needed more.

Plugged into the California supercomputing scene, Jones knew whom to call. And through ICME's relationship with Dell Inc. (Nasdaq: DELL), he made further contacts.

He did the vendor dog-and-pony. "They get you there, promise you the world, and then you have lunch and they show you a bunch of PowerPoints." After viewing solutions from Panasas, DataDirect Networks Inc., EMC Corp. (NYSE: EMC), IBM Corp. (NYSE: IBM), and Ibrix Inc., he settled on Panasas.

Jones claims dramatic performance improvements with the Panasas file system, PanFS. Under the old setup, concurrent writes over NFS with eight dual-processor jobs ran at 17.80 Mbytes/s; with PanFS, the same workload hits 154 Mbytes/s. Reads show a similar jump: 36.97 Mbytes/s over NFS versus 190 Mbytes/s with PanFS.

Besides performance, ease of integration was a key factor in Jones's choice. Simply put, it was a speedy setup that didn't require extra training. "I just finished setting up a new cluster the other day," he says. "The Panasas shelf integration was 1 hour 55 minutes from the time I opened the first box to fully operational status on the cluster."
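The article doesn't spell out how those throughput figures were gathered. As a rough illustration only, a minimal concurrent-write test of that general shape might look like the sketch below; the mount point, file size, and job count are assumptions for illustration, not details of ICME's benchmark.

```python
# Hypothetical concurrent-write throughput sketch; paths and sizes are
# illustrative assumptions, not taken from ICME's actual tests.
import os
import time
from multiprocessing import Pool

MOUNT_POINT = "/mnt/panfs"   # assumed mount point for the shared filesystem
JOBS = 8                     # "eight dual-processor jobs" from the article
FILE_SIZE_MB = 512           # per-job write size (assumption)
BLOCK = b"\0" * (1 << 20)    # 1-Mbyte write block

def write_job(job_id: int) -> float:
    """Write FILE_SIZE_MB megabytes to a per-job file; return elapsed seconds."""
    path = os.path.join(MOUNT_POINT, f"bench_{job_id}.dat")
    start = time.time()
    with open(path, "wb") as f:
        for _ in range(FILE_SIZE_MB):
            f.write(BLOCK)
        f.flush()
        os.fsync(f.fileno())  # push data to the storage system, not just the page cache
    return time.time() - start

if __name__ == "__main__":
    wall_start = time.time()
    with Pool(JOBS) as pool:
        pool.map(write_job, range(JOBS))
    wall = time.time() - wall_start
    total_mb = JOBS * FILE_SIZE_MB
    print(f"{JOBS} concurrent writers: {total_mb / wall:.1f} Mbytes/s aggregate")
```

On a real cluster the writers would typically run on separate nodes through the batch scheduler or MPI rather than as local processes, but the aggregate-throughput arithmetic is the same.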

Stanford is one of several research centers in the Panasas customer base. Though the vendor has tried expanding its roster to enterprise businesses, it appears most at home in labs. (See Panasas Clusters at LSU, Panasas Powers Stanford Research, Lawrence Livermore Picks Panasas, Telstra Updates Q3, and Panasas Completes Record H1.) Part of the reason is that labs use clusters, and clusters benefit from the Panasas approach.

And the clustered approach is catching on. Panasas execs' claim of 500 percent growth this year is reflected in the success of competitors, most notably Isilon Systems, which also claims substantial growth, albeit in more commercial deployments (see Top Ten Private Companies: Summer 2005, page 3). Ibrix also has succeeded in labs with a software-only solution. (See Ibrix Demos Cluster Hugeness at CCR, NCSA Selects Ibrix, and Ibrix Touts Monster System.)

Mary Jander, Site Editor, Byte and Switch
