Sun And AMD: A Potent Combination?

So Sun is adding to its line of servers powered by the AMD Opteron chip. But how does its price/performance stack up against others, namely Dell?

September 13, 2004

5 Min Read

The news that Sun Microsystems Inc. is adding to its line of servers powered by the AMD Opteron chip seems to indicate that the company is sold on the AMD product. Indeed, John Fanelli, senior director of marketing for the company, says that the whole idea is that Sun can offer great performance at a low price point.

"The AMD chip outperforms the Intel Xeon and Nokona chips," Fanelli says, so [these servers] build around that."

Fanelli notes there are three classes of such products: the Sun Fire V20Z, available with up to two processors; the Sun Fire V40Z, with up to four; and two recently announced Sun Java workstations.

"Opteron gives a better value story," he adds, explaining that it's good at scaling, and that it's backward-compatible with 32-bit applications. And, of course, there's that performance score, which Fanelli says is "world-record."

But Sun could have said something like that when it came out with servers powered by Intel chips. What's the difference here? Are the new servers really delivering on the promise?

If you visit Sun's Web site, you can find a page with links to SPEC CPU test results for one of these servers, the Sun Fire V20Z. But before you go there and take a gander at those results, you need to know something about how the SPEC CPU benchmark suite works.

First of all, SPEC, the Standard Performance Evaluation Corporation, is a non-profit organization whose membership comprises a veritable who's-who of chip and computer makers. The members provide talented engineers and programmers who spend part of their time developing benchmarks. You can find out a great deal about how the latest CPU benchmark suite, CPU2000, was developed on SPEC's Web site.

In a nutshell, SPEC builds its benchmarks from a series of compute-intensive applications, such as simulations, natural-language processing and graphics rendering, runs them on a reference machine, and then normalizes the times for newly tested machines against the times for that reference machine. More precisely, SPEC runs the reference machine, and then members test their own machines and report the normalized results to SPEC, according to rules that SPEC promulgates.
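
To make that normalization concrete, here is a minimal Python sketch of the idea. The benchmark names and all of the timings are invented for illustration; SPEC's real suite uses many more programs and a fixed reference machine, and this is not SPEC's tooling.

```python
from math import prod

# Hypothetical reference-machine times (seconds) and times measured on the
# system under test, for three made-up benchmarks.
reference_times = {"simulate": 1400.0, "parse": 1800.0, "render": 2100.0}
measured_times  = {"simulate":  310.0, "parse":  450.0, "render":  600.0}

# Each benchmark's ratio: how many times faster than the reference machine.
# SPEC scales by 100, so the reference machine itself scores 100.
ratios = {name: 100.0 * reference_times[name] / measured_times[name]
          for name in reference_times}

# The overall score is the geometric mean of the per-benchmark ratios, which
# keeps one unusually fast or slow program from dominating the result.
score = prod(ratios.values()) ** (1.0 / len(ratios))
print(ratios)
print(f"overall (geometric mean): {score:.1f}")
```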

Sun links to a SPEC Web page that offers results for the CFP2000 test, a component-level benchmark that exercises the CPU, memory and compiler on a system. SPEC itself says, in a FAQ on the site, that CFP2000 is NOT a test of the whole system: it does not exercise the networking capability of the system, and it does not stress I/O. Further, the results will depend heavily on how much memory is installed in the test machine: more memory means less paging, and that means faster execution on large programs and large datasets.

A SPECfp score is a measure of the time required to run the floating-point benchmarks as the only programs executing. SPECfp_rate scores, on the other hand, measure throughput, with several copies of the benchmarks running at once, so they sit on a different scale from the single-task SPECfp numbers. The SPECfp_rate_base score is one attained with the benchmark programs compiled using conservative compiler settings, while the SPECfp_rate score comes from aggressive compiler settings. Higher numbers on all these scores are better.

The results Sun cites show a SPECfp_rate2000 score of 37.2 and a SPECfp_rate_base2000 score of 33.7. And, indeed, a quick scan of all published CPU2000 test results shows that these numbers are very high for a two-processor system.
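
As a quick sanity check on those two figures, the spread between the peak and base scores shows roughly what the aggressive compiler settings buy. Here is the arithmetic as a trivial sketch; the scores are the ones Sun cites, and nothing else is assumed.

```python
# Scores from the SPEC results Sun cites for the two-way Sun Fire V20Z.
peak = 37.2   # SPECfp_rate2000 (aggressive compiler settings)
base = 33.7   # SPECfp_rate_base2000 (conservative settings)

# The spread is roughly what benchmark-specific compiler tuning buys; base
# is the more realistic figure for code you compile yourself.
print(f"peak is {100.0 * (peak - base) / base:.1f}% above base")
# -> peak is 10.4% above base
```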

How do these results compare with a similar computer system? That's hard to tell, because the details of the tested Sun system make it unique. It has two Opteron 250 chips, each with a single core, and 8 GB of memory, which means paging will not be a problem. A configuration like that doesn't come cheap. Fanelli said that an entry-level V20Z with one CPU, 1 GB of memory and one disk drive would cost $2,795. Sun doesn't post a price on its Web site for an 8-GB V20Z like the one in the SPEC results, but a machine with two processors and 4 GB would cost $6,995, without an operating system.

Just for a rough comparison, Dell has system results posted on the SPEC site as well. One is a PowerEdge 2850, with two 3.6-GHz Intel Xeon processors inside and 8 GB of memory. For this machine, the SPECfp_rate score is 24.9, and the SPECfp_rate_base score is 24.8. Those are roughly 67 percent and 74 percent, respectively, of the corresponding scores for the 8-GB Sun Fire V20Z.

If you go through Dell's pricing dialogs on its Web site for a system with these specifications, you get a price of $5,276, about 75 percent of the cost of the 4-GB Sun Fire V20Z. Both prices are without an operating system.
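
To put the two sets of numbers side by side, here is a rough price/performance calculation, using the scores and prices discussed above. One assumption is baked in and worth flagging: the Sun price is for a 4-GB configuration, while its SPEC result came from an 8-GB machine, so the Sun figure is flattered somewhat.

```python
# Rough price/performance from the figures above. Caveats: the Sun price is
# for a 4-GB configuration while its SPEC result came from an 8-GB machine,
# the Dell price is for the 8-GB tested configuration, and neither price
# includes an operating system.
systems = {
    "Sun Fire V20Z (2x Opteron 250)":         {"specfp_rate": 37.2, "price": 6995},
    "Dell PowerEdge 2850 (2x 3.6-GHz Xeon)":  {"specfp_rate": 24.9, "price": 5276},
}

for name, s in systems.items():
    per_kilodollar = s["specfp_rate"] / (s["price"] / 1000.0)
    print(f"{name}: {per_kilodollar:.2f} SPECfp_rate points per $1,000")
# Sun:  ~5.32 points per $1,000
# Dell: ~4.72 points per $1,000
```

On this crude points-per-dollar metric the two systems land closer together than the raw scores alone suggest, and a per-dollar edge at a higher sticker price is not the same thing as more performance at a lower cost, which is why the caveats that follow matter.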

These numbers are not conclusive, not least because the systems are not identical, although I tried to get comparable machines. I used the Dell because it seemed the closest match in the list of system results on the SPEC site. But it does seem clear that Sun's claims of more performance at a lower cost are not a slam-dunk, at least on this evidence. Remember, too, that my comparison doesn't have an exact system-to-system equivalence, doesn't reflect the price breaks that are always available, and doesn't include the cost of the operating system, which can be very significant.

This exercise illustrates one salient point: Take claims of price/performance "goodness" with a grain of salt. The real question is how the system will operate in your environment, and you can only find that out by getting a test system and giving it a whirl. If you're a very large organization, that shouldn't be too hard. For smaller shops, study the numbers as well as you can, but try hard to get an apples-to-apples comparison, and try to bring a sample in to work with under simulated real conditions. Take benchmark information as a first reference, a place to start narrowing your server choices.

David Gabel has been testing and writing about computers for more than 25 years.
