The test showed PostgreSQL results that were about 12% lower than an Oracle database system running on a comparably priced HP Itanium system, an outcome that Josh Berkus, a core contributor to the PostgreSQL open source project, called "a good day for open source."
PostgreSQL has had a reputation as a feature-rich but compute-hungry system that is slower than Oracle and its open source competitor, MySQL.
The benchmark is not conclusive. The tests were run on different hardware with different numbers of processor cores. The main thing the Oracle and PostgreSQL tests had in common was that they were run on comparably priced hardware, said Berkus. Still, the results tend to offset PostgreSQL's reputation for poor performance, he claimed.
Sun distributes PostgreSQL with its Solaris 10 operating system. In a bid to show the value of the combination, it sponsored a test using a benchmark from the Standard Performance Evaluation Corp. (SPEC). The specific test was SPECjAppServer2004, which is meant to measure the performance of a Java application server and database combination on a particular piece of hardware.
Oracle used the same SPECjAppServer2004 benchmark last December to boast of the performance of the Oracle 10g database and Oracle Application Server combination on a hardware cluster.
Three different application servers working with three different databases have produced surprisingly similar benchmark results in recent tests.
Measured in JOPS, or Java application server operations per second, the Oracle database and Oracle Application Server combination yielded 874 operations per second. HP conducted that benchmark, since it ran on the company's Itanium-based Integrity server.
The PostgreSQL 8.2/Java Application Server combo on Sun UltraSparc hardware yielded 778 operations per second.
The Oracle system had the edge in listed hardware value, $74,000, compared with $65,000 for the Sun hardware used for the PostgreSQL system.
Nevertheless, the tests are the most comparable run so far under the SPEC benchmark and show only a 12% performance margin for Oracle, said Berkus. He was involved in formatting the benchmark according to SPEC guidelines over the last six months. He said Sun, Unisys, and NTT contributed expertise to the benchmark.
A third test sponsored by Sun, using BEA WebLogic with IBM DB2, yielded 802 JOPS, running on similar UltraSparc hardware.
SPEC benchmarks must follow set rules and guidelines in how they are conducted and are subject to review before the results can be published. The SPEC organization nevertheless says each vendor is responsible for the accuracy of the benchmark.
Berkus said the publication of the results for PostgreSQL had the potential to "influence corporate software buyers," although few have warmed to the PostgreSQL system in the past.
In a July 9 blog post on the benchmark, Berkus conceded that PostgreSQL has "a reputation for sluggish performance. ... This publication [of benchmark results] shows that a properly tuned PostgreSQL is not only as fast or faster than MySQL, but almost as fast as Oracle."
Berkus said neither MySQL nor PostgreSQL has been subject to many benchmark tests that allow them to be compared with proprietary systems, because of the expense of conducting the benchmarks. A true head-to-head comparison of database performance would probably require use of the TPC benchmark, he added. SPEC benchmarks measure more than the database, but for now, they offer the most direct comparison available.