Some say they are abstract if not downright misleading. But TPC benchmarks remain the only available standardized test of total-system performance and cost. Use them with caution.
If you review the latest TPC-H results, for example, you'll see plenty of tests featuring IBM, Oracle and Microsoft databases (as well as a few independents), but you won't see any from Teradata, Netezza or several other prominent member companies (which I take to be in the "customers know us" camp Majdalany described).
In general, tests unrealistically skewed to highlight performance will end up looking bad in terms of cost, and vice versa. Keep that tradeoff in mind when comparing benchmarked configurations to the ones you'd actually buy. The details are all there when you download the executive summaries and full reports: pricing, capacities and much more.
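To make that tradeoff concrete, here's a minimal sketch with made-up numbers (not actual TPC results) showing how a raw-throughput leader can still lose on the price/performance metric the TPC also publishes alongside the headline figure (for TPC-H, dollars per QphH; for TPC-C, dollars per tpmC).

```python
# Hypothetical, illustrative numbers only -- not actual TPC results.
# Each entry: (configuration label, QphH-style throughput, total system price in USD).
results = [
    ("Config A (performance-tuned)", 1_200_000, 6_000_000),
    ("Config B (cost-conscious)",       700_000, 1_400_000),
]

for name, qphh, price in results:
    # TPC-style price/performance: total system cost divided by throughput.
    print(f"{name}: {qphh:,} QphH, ${price / qphh:.2f} per QphH")

# Config A wins on raw throughput, but Config B delivers far better
# price/performance -- exactly the skew described above.
```

The executive summaries contain everything needed to run this kind of comparison yourself, including the fully priced configuration behind each result.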
According to the TPC, in May alone more than 30,000 unique visitors accessed or downloaded TPC-C test results and 13,000 did the same for TPC-H results. And in a sign that TPC checks and balances just might work, the test that sparked Monash's complaint was challenged by another vendor and, according to this report, subsequently withdrawn.
I recently wrote that "TPC benchmarks and claims to be X times faster than such and such competitor are irrelevant. Customers want references from other companies that fit their description and that have similar challenges."
That comment prompted a call from the TPC and an invitation to speak to Majdalany and member Meikel Poess, the former chairman of the TPC-H subcommittee and a principal developer at Oracle.
I learned a lot about the TPC in that conversation, not least that it's a very public and open organization and a vehicle for disseminating plenty of valuable information. While you often hear TPC stats quoted in hype-filled press releases, I wasn't fully aware of all the detailed pricing and performance data available when you download individual test reports. I also learned that the TPC-H benchmark is currently under review, with hopes of introducing an improved 3.0 benchmark that tests higher-scale deployments and larger numbers of users.
I remain convinced that the best way for would-be database buyers to choose the right product is to ask vendors for proof-of-concept deployments geared to their specific needs. But as a starting point in your research -- and with a discerning eye on capacities and the specifics of the configurations -- it can't hurt to look at the results of uniform performance tests and standardized statistics on cost and power consumption.