Power Up With Utility Computing

IBM sees the benefit of appealing to smaller companies with an on-demand model that sells services, builds loyalty, and gives it an edge in functionality.

August 29, 2005


Startup QuantumBio Inc., a provider of software tools for drug, biotechnology, and pharmaceutical companies, is too small a business to pay a small fortune for an IBM Blue Gene supercomputer. But that doesn't mean QuantumBio can't offer its customers access to a piece of the same power that outfits like the U.S. Department of Energy enjoy. With the launch of IBM's fourth Deep Computing Capacity On Demand Center in March in Rochester, Minn., the door was opened for QuantumBio to take advantage of 5.7 teraflops--that's 5.7 trillion calculations per second--of Blue Gene's power.

Founded a little more than three years ago, QuantumBio is in a revenue-building stage and has limited internal computing capacity, says Lance Westerhoff, chief software engineer. Installing a Blue Gene system in-house to host its customers' drug simulations would carry startup costs of around $2 million, plus the expense of hiring people with the skills to run the system and of ongoing maintenance. So instead, QuantumBio has IBM host its applications on the Blue Gene supercomputer, which IBM offers to its on-demand customers at a starting price of $10,000 per week. QuantumBio sells its apps in a utility model.
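
As a rough sketch of the economics Westerhoff describes, the back-of-the-envelope calculation below compares the two approaches using the figures cited here; the weekly staffing-and-maintenance figure is an illustrative assumption, not a number QuantumBio or IBM has reported.

```python
# A back-of-the-envelope sketch using the figures cited above: roughly
# $2 million to install a Blue Gene system in-house versus IBM's
# on-demand starting price of $10,000 per week. The weekly in-house
# staffing/maintenance figure is an illustrative assumption only.

in_house_startup = 2_000_000             # reported in-house startup estimate, in dollars
on_demand_weekly = 10_000                # reported on-demand starting price, dollars/week
assumed_inhouse_weekly_overhead = 5_000  # hypothetical staffing + maintenance, dollars/week

# Ignoring staffing and maintenance, continuous on-demand use would take
# 200 weeks (almost four years) to add up to the in-house startup cost:
print(in_house_startup / on_demand_weekly)

# Counting the assumed in-house overhead, on-demand stays cheaper even
# longer, since each on-demand week costs only $5,000 more net:
print(in_house_startup / (on_demand_weekly - assumed_inhouse_weekly_overhead))
```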

A highly secure VPN provides QuantumBio customers with access to the utility. "Maybe the most intangible benefit is we could say to our customers, 'You're not working with a small company like QuantumBio to maintain your security, you're actually working with IBM,'" Westerhoff says.

Rather than buy an IBM Blue Gene supercomputer, companies can use the computer's power on an on-demand basis.

IBM began its computing-on-demand effort in earnest a little over two years ago. It's an investment that the company hopes will put it into a leadership position in this emerging computing phenomenon. IBM also wants its centers to serve as a platform for customer innovation that requires IBM technology.

"It's an efficient model for us and an economical model for our clients," says Rebecca Austen, director of deep-computing marketing for IBM. "Deep computing on demand provides a new financial solution for cost of capital, cost of space, cost of power and cooling, and cost of administrative staff."IBM operates four regional centers, where it has sold about 10 million hours of computing time to about two dozen active customers and others that have been experimenting with the technology, says David Gelardi, VP of deep computing capacity on demand for IBM. The centers offer customers access to clusters of computers based on Xeon processors from Intel, Opteron processors from Advanced Micro Devices Inc., and its own Power processors, as well as access to the latest Blue Gene supercomputer installation.

Its first center opened in Poughkeepsie, N.Y., in June 2003 and now hosts more than 4,000 Xeon rack-mounted nodes, 240 Xeon blade-server nodes, and nearly 200 Opteron nodes. Its Montpellier, France, center has 500 Xeon nodes, and its Rochester, Minn., Blue Gene facility offers more than 2,000 CPU nodes.

IBM also has established a partnership with VeriCenter Inc., an independent provider of computing capacity, to offer more than 1,000 nodes of Xeon processor capacity out of VeriCenter's Houston offices, which are used primarily by local companies in the oil and gas industry. IBM counts this as one of its four on-demand computing installations.

The on-demand centers are located near fairly densely populated areas with access to high-bandwidth metropolitan area networks that let customers tap into the IBM computing resources from their offices. But IBM plans to open other centers as demand increases.

And it will increase, predicts Charles King, an analyst with Pund-IT Research. So far, IBM hasn't turned a profit on any of its centers. But on demand isn't unlike the Internet-based E-commerce industry of about 10 years ago, King says. "At that time, there was a lot of debate about whether E-business was a goal worth pursuing," he says. "The proof came as people found that E-business allowed them to save money in operational costs, operate more efficiently, and even open up new business opportunities. The same kind of opportunity exists now for utility computing."

For Exa Corp., a provider of fluid flow-simulation software, access to IBM's on-demand capacity has allowed it to address large markets, such as the automobile industry, both quickly and economically. "It takes our compute costs down by a factor of five to 10," says Stephen Remondi, Exa's president and CEO. Of hosting its customers' applications in IBM's on-demand centers, Remondi says, "We can say yes to anything because we're basically leveraging IBM's balance sheet. They can amortize the cost of thousands of processors over many large customers, where I would have to amortize over a much smaller base."

Access to on-demand capacity has reduced Exa's computing costs by a factor of five, CEO Remondi says.

Exa has a contract with IBM, at an undisclosed cost, for millions of CPU hours per year, which it taps into at the Poughkeepsie facility. Exa can order specific workloads in increments of 128 processing nodes. Say an automotive manufacturer has 15 or more potential simulations it would like to test over a short period. Exa can call IBM on a Friday afternoon and contract for a few hundred or more processor nodes, run the simulations over a weekend, and disengage on a Monday, Remondi says.
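
To put rough numbers on that kind of weekend burst, the sizing sketch below works in 128-node increments; the per-simulation node count and runtime are hypothetical assumptions, and only the 128-node ordering increment comes from Exa's arrangement with IBM.

```python
# A rough sizing sketch for the weekend-burst scenario described above.
# The per-simulation node count and runtime are hypothetical assumptions;
# only the 128-node ordering increment comes from the article.

import math

INCREMENT = 128            # nodes per ordering increment (from the article)
WEEKEND_HOURS = 60         # Friday evening to Monday morning, roughly

simulations = 15           # the automaker's batch of candidate simulations
nodes_per_sim = 64         # hypothetical nodes each simulation occupies
hours_per_sim = 12         # hypothetical wall-clock time per simulation

# How many simulations can one batch of nodes run back to back over the weekend?
runs_per_slot = WEEKEND_HOURS // hours_per_sim          # 5 sequential runs
slots_needed = math.ceil(simulations / runs_per_slot)   # 3 concurrent slots
nodes_needed = slots_needed * nodes_per_sim             # 192 nodes

# Round up to the 128-node increments Exa can order from IBM:
increments = math.ceil(nodes_needed / INCREMENT)        # 2 increments
print(f"Order {increments * INCREMENT} nodes ({increments} x {INCREMENT}-node increments)")
```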

IBM's next on-demand moves may lead it to open capacity-on-demand centers in areas where there's a concentration of clients in a particular industry. IBM has been in discussions with several industry consortia about establishing centers to meet specific needs, Austen says. The consortia are looking to create computing resources that could be dedicated to their use, but configured so that multiple members can tap into the computational power available.

At the same time, IBM itself plans increasingly to use its computing-on-demand centers "as a way to open doors to new technologies that may not be fully proven out in real-world applications," Austen says.

One example is the Cell processor, developed by IBM, Toshiba, and Sony, with its initial use as an engine for Sony's next-generation PlayStation gaming console. The device combines a PowerPC processor core with eight smaller processing elements. In June, Mercury Computer Systems Inc., a developer of board-level computing systems, revealed plans to create Cell-based systems for use in imaging-intensive markets, and IBM believes there could be many new applications for Cell, including use in serverlike clustered systems. Austen envisions that clusters of Cell processors could be used to create applications for security surveillance, digital media, and image processing.

To encourage exploration by vendors, IBM located its Cell simulator in Poughkeepsie. Customers can access the simulator at no cost to experiment with the Cell processor and determine whether it would be suitable for their use and in what kind of configuration. The benefit to IBM is that the simulator helps vendors determine what types of systems could be built on the new processor technology and how potential utility customers would take advantage of them.
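
The one-control-core-plus-eight-elements layout Austen describes can be pictured with a toy fan-out sketch like the one below; it uses ordinary Python multiprocessing rather than anything Cell-specific, and the image-tile workload is purely illustrative.

```python
# A toy illustration of the Cell layout described above: one control core
# (the PowerPC element) dividing an image-processing job across eight
# smaller processing elements. This is an ordinary multiprocessing sketch,
# not Cell-specific code; the tile-brightening workload is hypothetical.

from multiprocessing import Pool

NUM_ELEMENTS = 8   # the Cell design pairs one PowerPC core with eight elements

def process_tile(tile):
    """Stand-in for the per-element kernel: brighten one image tile."""
    return [min(255, pixel + 16) for pixel in tile]

def main():
    # The "control core" splits the frame into eight tiles and farms them out.
    frame = list(range(256))                      # fake 1-D frame of pixel values
    tile_size = len(frame) // NUM_ELEMENTS
    tiles = [frame[i:i + tile_size] for i in range(0, len(frame), tile_size)]

    with Pool(processes=NUM_ELEMENTS) as pool:    # eight workers, like eight elements
        processed = pool.map(process_tile, tiles)

    result = [pixel for tile in processed for pixel in tile]
    print(len(result), "pixels processed across", NUM_ELEMENTS, "elements")

if __name__ == "__main__":
    main()
```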

Another emerging technology available on demand is IBM's Deep Computing Visualization, introduced in February. The technology, available at IBM's Watson Research Center in Yorktown Heights, N.Y., and at the Poughkeepsie center, uses IBM IntelliStation workstations to provide enhanced graphics in two visualization modes: Scalable Visual Networking, which increases screen resolution and image size, and Remote Visual Networking, which allows remote use of the application and can bring "visualization down to the masses," Austen says. CAD engineers use it to provide more-accurate representations of their work.

Although IBM has no expectation that businesses will soon be moving to full utility-style environments for all their computing needs, the market has started to evolve. The possibilities of where it can go next are nearly infinite.
