Some say they are abstract if not downright misleading. But TPC benchmarks remain the only available standardized test of total-system performance and cost. Use them with caution.

Doug Henschen, Executive Editor, Enterprise Apps

September 3, 2010

2 Min Read

If you review the latest TPC-H results, for example, you'll see plenty of tests featuring IBM, Oracle and Microsoft databases (as well as a few independents), but you won't see any tests from Teradata, Netezza and a few other prominent member companies (which I take to be in the "customers know us" camp Majdalany described).

Vendors can also test configurations that make them look good, Majdalany acknowledges, either on the "Top Ten Results by Performance" chart or on the "Top Ten Results by Price/Performance" listing.

In general, tests unrealistically skewed to highlight raw performance will end up looking bad on cost, and vice versa. Keep that trade-off in mind when comparing benchmark configurations against the ones you would actually deploy. The details are all there -- pricing, capacities and much more -- when you download the executive summaries and full reports.

According to the TPC, more than 30,000 unique visitors accessed or downloaded TPC-C test results and 13,000 accessed or downloaded TPC-H test results in May alone. And in a sign that TPC checks and balances just might work, the test that sparked Monash's complaint was challenged by another vendor and, according to this report, subsequently withdrawn.

I recently wrote that "TPC benchmarks and claims to be X times faster than such and such competitor are irrelevant. Customers want references from other companies that fit their description and that have similar challenges."

That comment prompted a call from the TPC and an invitation to speak to Majdalany and member Meikel Poess, the former chairman of the TPC-H subcommittee and a principal developer at Oracle.

I learned a lot about the TPC in that conversation, not least that it is a public, open organization and a vehicle for disseminating plenty of valuable information. While you often hear TPC stats quoted in hype-filled press releases, I wasn't fully aware of all the detailed pricing and performance data available when you download individual test reports. I also learned that the TPC-H benchmark is currently under review, with hopes of introducing an improved 3.0 benchmark that would test higher-scale deployments and larger numbers of concurrent users.

I remain convinced that the best way for would-be database buyers to choose the right product is to ask vendors for proof-of-concept deployments geared to their specific needs. But as a starting point in your research -- and with a discerning eye on capacities and the specifics of the configurations -- it can't hurt to look at the results of uniform performance tests and standardized statistics on cost and power consumption.

About the Author(s)

Doug Henschen

Executive Editor, Enterprise Apps

Doug Henschen is Executive Editor of InformationWeek, where he covers the intersection of enterprise applications with information management, business intelligence, big data and analytics. He previously served as editor in chief of Intelligent Enterprise, editor in chief of Transform Magazine, and Executive Editor at DM News. He has covered IT and data-driven marketing for more than 15 years.
