Sun Bit by Benchmark Bug

Flaw in SPC benchmark annuls Sun's test of HDS 9980V array - giving EMC a reason to gloat

May 21, 2003


Earlier this month, Sun Microsystems Inc. (Nasdaq: SUNW) -- and by extension, its partner Hitachi Data Systems (HDS) -- was set to claim it had the fastest storage system in the world, until it discovered that the performance benchmark test it was using had a critical flaw that invalidated the results.

Sun's initial testing, using the Storage Performance Council (SPC) benchmark, showed that its StorEdge 9980, a rebranded version of HDS's Lightning 9980V storage array, performed at 67,098.02 I/O operations per second (IOPS). That would have shattered 3PARdata Inc.'s previous record of 47,001.1 IOPS by more than 40 percent (see 3PAR Claims Benchmark Title).
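For context, the size of that margin follows directly from the two published figures:

    (67,098.02 - 47,001.1) / 47,001.1 ≈ 0.43

or roughly a 43 percent improvement over the 3PAR result.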

But before Sun had a chance to officially flaunt the results, the company retracted them due to a flaw in the test itself, according to the SPC, a vendor consortium whose sole purpose is to maintain and promote storage performance benchmarks.

A bug in the SPC-1 test's workload generator caused Sun's results to be "noncompliant," says Walter Baker, an engineer with consulting firm Gradient Systems Inc., who is contracted as the SPC's administrator and auditor. "The generator that creates the I/O stream going across the specified storage did not cover the amount of storage they configured," he says, adding, "Sun didn't do anything wrong... They configured what they said they configured."
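For readers unfamiliar with how such a flaw slips through, here is a minimal, hypothetical sketch in Python of the kind of coverage check at issue: a workload generator draws I/O addresses from a range, and a run only means something if that range spans the capacity the vendor actually configured. The function names and capacity figures below are illustrative assumptions, not part of the SPC-1 specification.

    # Hypothetical illustration of the coverage problem Baker describes:
    # if the generator's addressable range falls short of the configured
    # capacity, the result is noncompliant even though the vendor's
    # configuration itself was correct.

    def generated_address_span(num_streams: int, blocks_per_stream: int) -> int:
        """Total block range the (hypothetical) generator can touch."""
        return num_streams * blocks_per_stream

    def run_is_compliant(configured_blocks: int, addressable_blocks: int) -> bool:
        """The I/O stream must cover the storage the vendor said it configured."""
        return addressable_blocks >= configured_blocks

    # Illustrative numbers only: a buggy generator whose address range falls
    # short of the configured capacity invalidates an otherwise clean result.
    configured = 18_000_000_000
    addressable = generated_address_span(num_streams=512, blocks_per_stream=30_000_000)
    print(run_is_compliant(configured, addressable))  # False -> noncompliant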

Sun is now busy preparing for a new benchmark test, which it claims will yield results as good as -- or better than -- the original ones. Nevertheless, the episode has stirred up renewed criticism of the SPC benchmark by EMC Corp. (NYSE: EMC), the test's biggest detractor.

"Sounds like a smoking gun to me," says Ken Steinhardt, EMC's director of technology analysis.

EMC is the only major enterprise storage vendor that's not carrying an SPC membership card. Since it withdrew from the organization in March 2000, the company has relentlessly bashed the group's efforts. "It has absolutely no value from a customer perspective," Steinhardt says of the benchmark. He says EMC is able to conduct much more meaningful performance tests at individual customer locations. "We can guarantee the kind of performance we can deliver in your environment," he says. "We look at the actual configurations."

Steinhardt denies that EMC has avoided testing its systems with the benchmark because they simply won't measure up. "I think our product speaks for itself -- and what we do on the customer floor speaks even louder."

But vendors that support the SPC see the benchmark as a valuable marketing tool. Several storage players, including 3PAR, Hewlett-Packard Co. (NYSE: HPQ), and LSI Logic Storage Systems Inc., have touted their results on previous tests (see HP EVA Breaks Record, HP Fiddles With Cache, and LSI Screams Past IBM, Sun).

Meanwhile, some end users say the SPC benchmark is of little value to them. "We find that no matter whose brand of storage we're using, we never stretch any of these things to the limit," says Bill Bender, technical manager in Lucent Technologies Inc.'s (NYSE: LU) IT organization. Lucent uses both EMC and Hitachi storage. "Most important for us is the after-sale support... No matter whose storage you buy, there will always be a time when things go boom in the night."

Others say that, while benchmarks are a useful reference, they seldom factor directly into the purchasing decision. "Benchmarks have their place," says one systems architect for a large bank, who asked to remain anonymous. "But we always bring everything into our own labs and do it all over again."

The SPC's Baker, however, asserts the benchmark, which is based on traces from a broad range of customer platforms, is very valuable. "It's an I/O benchmark, and it represents an I/O stream," he says. In addition, he says, the organization is looking into expanding its repertoire with two additional benchmarks.

As for companies that can't match up their specific configurations to the benchmarked version, Baker says an increasing number of them are asking for the SPC benchmark kit so that the vendor can redo the test in their specific environments. "Over the past six to nine months, inquiries from end users have exceeded inquiries from vendors," he says, pointing out that they're mostly interested in finding out which vendors have posted results. "[EMC's] lack of participation will be noted."

Still, other industry observers say that, despite its flaws and limitations, the SPC benchmark is more reliable and useful than the benchmarks vendors choose to run themselves.

"The SPC benchmark is really the only benchmark today that is vendor-neutral. I'm very skeptical of any vendor benchmark," says Randy Kerns, partner with Evaluator Group. "Customers should ask for performance guarantees... but this benchmark is a good indicator for you."The anonymous systems architect, however, questions just how independent the SPC test really is. "That independent lab is being paid by somebody," he says. "And who are they being paid by? They have to answer to their customers [the storage vendors]... That could skew the results."

That's a valid concern, concedes Baker, but he says the competing interests of the different vendors are meant to level the playing field. In addition, he points out, SPC's full-disclosure report allows end users to view exactly how the test was conducted and with what configuration. "We describe what they did as closely as possible," he says.

As for the problems encountered in the Sun test, Baker insists they don't call the benchmark's validity into question. "That we caught the error shows that our multistage review works. We want to ensure that the results are meaningful."

Eugénie Larson, Reporter, Byte and Switch
