The National Science Foundation has awarded $13.65 million over five years to a consortium of 15 U.S. universities and four national laboratories that aims to create a global computing and storage network for major scientific experiments.
The only problem: none of the commercially available technology, particularly on the storage side, is up to the job, according to Bill St. Arnaud, director of network operations for Canarie Inc. (the Canadian equivalent of Internet2), an organization devoted to futuristic networking.
The consortium, called the iVDGL (International Virtual Data Grid Laboratory), will build its grid in partnership with the European Union, Japan, and Australia to form the world's first truly "global grid" for physics, astronomy, biology, and engineering research, iVDGL spokespeople say.
"Grid computing" is essentially a buzzword for clusters of servers joined together over the Internet. The idea is to allow organizations spread across geographies to share applications, data, and computing resources. They employ specially developed protocols provided by the open source community (Globus) and other open source technologies, like Linux.
Because of the distributed nature of the network, technologies like Fibre Channel that create isolated islands of storage are wholly inappropriate for grid environments, according to St. Arnaud. "Many of the commercial products don't meet our requirements at all," he says. "They are fine for banks and institutions, but we need an order of scale and complexity that far exceeds this."