IBM Calls Out Oracle On Server And Systems Claims

Do benchmarks get even less meaningful in an era of engineered systems optimized for specific tasks?
Which optimized/engineered system will run faster? You could accept vendor claims at face value, wait for TPC-C (transactional) benchmarks, or demand some level of proof-of-concept testing (and here, vendors should be going all out to use the cloud as a sandbox environment).

As for those two industry-standard measures that Oracle did offer, a T4-based system did outperform an IBM Power 7 system on a SpecJ benchmark. Spang acknowledged as much, but he said Oracle failed to disclose that its platform needed a lot more hardware to deliver the better performance. The system Oracle used for the SpecJ test included twice the number of app server cores, twice the number of database cores, four times the amount of memory, 48 additional cores in the storage tier for I/O, eight times the amount of disk cache, and eight times more database storage than IBM's Power 7-based system used for the same test, according to an IBM report.
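One way to sanity-check a comparison like that is to normalize the published score by the hardware each configuration consumed. Here's a minimal sketch of the idea; the scores and core counts below are hypothetical placeholders, not the actual SpecJ submissions:

```python
# Normalize a benchmark score by the compute resources used to produce it.
# All figures here are hypothetical, chosen only to illustrate the point.

def per_core_score(score: float, app_cores: int, db_cores: int) -> float:
    """Benchmark operations per second divided by total compute cores."""
    return score / (app_cores + db_cores)

# Hypothetical vendor A: higher raw score, but on twice the cores.
vendor_a = per_core_score(score=40_000, app_cores=64, db_cores=32)
# Hypothetical vendor B: lower raw score on half the hardware.
vendor_b = per_core_score(score=30_000, app_cores=32, db_cores=16)

print(f"A: {vendor_a:.1f} ops/s per core")  # 416.7
print(f"B: {vendor_b:.1f} ops/s per core")  # 625.0
```

The raw throughput number alone would crown vendor A; the per-core view shows vendor B doing more with less, which is exactly the kind of detail IBM says Oracle left out.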

"Based on our calculations, the T4 chips narrow the gap in performance against the Power 7, but they still have not caught up to a product that has been available for more than a year," Spang said.

As I reported last month, some analysts believe the T4 will help Oracle stem steady erosion in Sun server marketshare that began long before Oracle acquired the hardware manufacturer in 2009. Sun has some 50,000 customers, and the T4 and a faster T5 chip expected next year are crucial to retaining those customers.

IBM has been the chief beneficiary of Sun's slide in the high-end Unix server market, with the company's server business growing 22% in the first quarter of this year. IBM Power 7 hardware replaced more than 560 Oracle Sun deployments in the first six months of 2011, Spang said.

[ Want more on Oracle OpenWorld? Read 5 High Points And Low Points At Oracle OpenWorld. ]

There's no doubt the bigger story where these Sun and IBM proprietary boxes are concerned is just how many of them are being replaced by Intel x86-based servers. In fact, last week at OpenWorld I talked to more than a few Oracle customers who are moving off of high-end Sun boxes and onto Exadata, Oracle's x86-powered database machine.

Whether it's Power or Sparc, it's nice to have independent benchmarks with all the facts of the configurations available. But as the layers of database, middleware, and applications are added to the stack (or, in Oracle's case, offered as a ready-to-install add-on), it's going to be up to you to justify the technology selection.

Are you choosing it because it's all available from a single vendor? That's not good enough. Are you choosing it because the application software best meets your needs? That's closer, but now you have to justify everything else that goes with it, and that gets down to price (of the total system) and performance -- as well as whatever convenience and efficiencies you expect with single-vendor support.

My advice? Do pilot tests with your data or a best-possible approximation of your workloads. Only then will you have an accurate sense of what's really entailed in deployment and configuration, and what to expect in terms of real performance. Anything will run faster on the latest generation of servers from any hardware vendor. So don't just accept an "X" times faster performance claim at face value.