Although the Transaction Processing Performance Council has been around since 1988, defining transaction processing and database benchmarks and disseminating objective, verifiable TPC performance data to the industry, it will be holding only its third conference to discuss the current state of benchmarking and keep pace with technological change. The day-long event, co-located with the International Conference on Very Large Data Bases (VLDB) in Seattle on Aug. 29, will focus on a number of topics, including some that Oracle co-chair Meikel Poess calls exciting.
"When we sent out the call for papers, we didn't focus on specific areas. We were looking for new ideas," Poess says. He and the other co-chair, Cisco's Raghunath Nambiar, had a list of relatively broad topics, such as virtualization, tools and energy efficiency.
A total of 26 papers were received, with each paper reviewed by four members of the program committee. In addition to several members from academia, the committee consisted of a who's who of the vendor community, including representatives from Dell, Google, HP, Microsoft and SAP. A dozen papers were accepted for presentation at the conference, but the three that stood out for Poess were "Metrics for Measuring the Performance of the Mixed Workload," "Normalization in a Mixed OLTP and OLAP Workload Scenario," and "Extending TPC-E to Measure Availability in Database Systems."
According to TPC's website, benchmarking first came of age with the release of TPC-A in November 1989. TPC-A expanded on the DebitCredit benchmark. The first results were announced in July 1990 and, by 1994, 33 companies were publishing on TPC benchmarks and 115 different systems had published TPC-A results. The progress over that span was significant: the first TPC-A result was 33 tpsA (transactions per second) at a cost of $25,500 per tpsA, while the best result achieved 111 times that throughput, at a cost of just $4,873 per tpsA.
The second benchmark, TPC-B, was the batch version of DebitCredit, without the network and user interaction (terminals) figured into the workload. The first results were published in mid-1991. The top ratings increased by 19 times, and the price-performance rating improved by a factor of 16. During its 22-year history TPC has developed nine standards and guidelines, and it has three more benchmarking standards under development: DS (Decision Support), ETL (Extract, Transform and Load) and V (Virtualization), says Nambiar.
Benchmarking is important because it gives people a readily available tool to measure performance, says Poess. Users need an objective way of measuring performance to make informed buying decisions, while vendors need something to differentiate themselves to an increasingly sophisticated base of users. It makes open competition possible, with the ability to do an apples-to-apples comparison, he adds.