Oak Ridge Plans Petaflop Supercomputer

Oak Ridge National Lab reveals plans for the world's largest supercomputer

September 28, 2005


OAK RIDGE, Tenn. -- Oak Ridge National Laboratory, which enriched the uranium for the first atomic bomb, will launch a petaflop-class supercomputer by the end of the decade to cope with spiraling demands for computing power at the high-profile research facility.

Speaking at the High Performance Computing User Forum at Oak Ridge today, Thomas Zacharia, the lab’s associate director, said the site’s current systems, which have a combined capacity of around 43 teraflops, are being stretched to the limit. Zacharia told NDCF that the lab is “constrained” by the size of these machines, particularly when it comes to the atomic-scale simulations needed to support its energy research.

A petaflop is a thousand trillion (10^15) floating-point operations per second, three orders of magnitude more than a teraflop, which is a trillion (10^12) operations per second.
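
To put those prefixes in concrete terms, here is a quick back-of-envelope calculation; the 43-teraflop figure comes from the article, and the rest is plain unit arithmetic:

```python
# Back-of-envelope FLOPS arithmetic; only the 43-teraflop figure is
# from the article, everything else is SI-prefix bookkeeping.
TERAFLOP = 10**12  # one trillion operations per second
PETAFLOP = 10**15  # one thousand trillion operations per second

current_tflops = 43   # Oak Ridge's current combined capacity
planned_pflops = 1    # the planned petaflop machine

ratio = (planned_pflops * PETAFLOP) / (current_tflops * TERAFLOP)
print(f"The planned machine is roughly {ratio:.0f}x today's capacity.")
# -> roughly 23x
```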

The new, as yet unnamed, supercomputer will blow the doors off other high-performance systems. The top supercomputer in the world is currently a 136.8-teraflop IBM Corp. (NYSE: IBM) Blue Gene/L system at the Lawrence Livermore National Laboratory. (See IBM Dominates Supercomputing and Invasion of the Coneheads.)

Zacharia adds that Oak Ridge’s technology needs are growing right across its research, from climate change through to astrophysics. “Every time we solve a bottleneck, another bottleneck comes up,” he says. “Our scientists need more resources.”

Zacharia did not reveal many specifics of the new supercomputer, although he did confirm that it will include both vector and scalar processors. Vector processors are designed to apply the same mathematical operation to many data elements at once; scalar processors handle just one data item at a time. A number of vendors, including Fujitsu Ltd. (OTC: FJTSY; Tokyo: 6702), Cray Inc. (Nasdaq: CRAY), and IBM offer this type of technology.
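
The distinction is easiest to see in code. Below is a minimal Python/NumPy sketch contrasting the two programming models; it illustrates the concept only and says nothing about the lab’s actual hardware:

```python
import time
import numpy as np

n = 2_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Scalar style: one data item at a time, as a scalar processor works.
start = time.perf_counter()
out_scalar = np.empty(n)
for i in range(n):
    out_scalar[i] = a[i] * b[i]
scalar_secs = time.perf_counter() - start

# Vector style: the same multiply expressed over whole arrays at once,
# the model that vector processors (and SIMD units) are built around.
start = time.perf_counter()
out_vector = a * b
vector_secs = time.perf_counter() - start

print(f"scalar loop: {scalar_secs:.2f}s  vectorized: {vector_secs:.4f}s")
```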

At the moment, Oak Ridge uses a mix of systems to power its research, including a Silicon Graphics Inc. (NYSE: SGI) Altix machine, X1E and XT3 systems from Cray, a Blue Gene/L supercomputer from IBM, and smaller clusters.

Zacharia told NDCF that the petaflop system will be a “single supercomputer,” although he insists it will rely on a “heterogeneous” infrastructure, which suggests that a range of vendors could be involved in building it. “Right now we’re talking with everyone.”

Oak Ridge, however, will need some serious storage to support the new system. “The petaflop machine we’re looking to build will have a petabyte of memory,” predicts Zacharia, adding that around 40,000 disks will be used for primary storage.
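
Those two figures invite a quick sanity check. The per-disk number below is our inference, not the lab’s, and it assumes primary disk capacity at least matches the petabyte of memory so data can be staged to disk:

```python
# Rough check on the storage figures quoted above (decimal units).
PETABYTE_GB = 1_000_000        # 1 PB expressed in gigabytes

disks = 40_000                 # the article's primary-storage disk count
gb_per_disk = PETABYTE_GB / disks
print(f"At least {gb_per_disk:.0f} GB per disk")   # -> 25 GB

# 25 GB was modest even by mid-2000s standards, which suggests the disk
# count is driven by aggregate I/O bandwidth rather than raw capacity.
```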

A system on this scale also will need plenty of power. Zacharia estimates that the new supercomputer will require around 40 megawatts, five times the 8 megawatts currently consumed by Oak Ridge’s systems. “Power consumption is a key attribute of a petascale computer. It’s not likely to be a computer that you can buy and stick it in a wall.”

Such is the scale of this challenge that the lab is looking beyond data centers to see how other major scientific installations deal with power. Zacharia, for example, recently visited a nuclear reactor in Haifa, Israel, to find out more about power-hungry systems.

But Zacharia feels that Oak Ridge will not have a problem feeding its new supercomputer. “The Tennessee Valley has abundant power. We have two power grids coming into the site.”

The lab, which has an annual budget of over $1 billion to support its research, is currently expanding its systems while it plans the new supercomputer. This will boost capacity to 100 teraflops next year and 250 teraflops in 2007, says Zacharia.

Oak Ridge is getting more involved in supercomputing at a time when many users are going in the opposite direction, replacing high-end systems with clusters built from commodity hardware. (See Sandia Blasts Off Blade Cluster and JP Morgan Goes Grid.) But Zacharia feels that supercomputers are the best fit for many of the lab’s tasks. “There’s a class of computational science that can only be done by tightly integrated, purpose-built [systems].”

Oak Ridge is not the only organization looking to radically expand its computing capacity. Steven Meacham, high-performance computing director at the National Science Foundation, a U.S. government body that supports scientific research, explained that the organization is looking to start work on its own high-end system in 2007. “The idea will be to have one petascale system that’s in place for science and research by 2010.”

— James Rogers, Site Editor, Next-Gen Data Center Forum
