Open Source PostgreSQL Trails Oracle In Benchmark, But Not By Much - InformationWeek

The test showed open source PostgreSQL results that were about 12% lower than an Oracle database system, which one backer called "a good day for open source."

Sun Microsystems has published a test of the PostgreSQL system running on its UltraSparc Niagara hardware, producing one of the few high-end server benchmarks available on the open source PostgreSQL database.

The test showed PostgreSQL results that were about 12% lower than an Oracle database system running on a comparably priced HP Itanium system, an outcome that Josh Berkus, a core contributor to the PostgreSQL open source project, called "a good day for open source."

PostgreSQL has had a reputation as a feature-rich but compute-cycle-hungry system, slower than both Oracle and its open source competitor, MySQL.

The benchmark is not conclusive. Different hardware running different numbers of core processors was used in tests. The main thing the Oracle and PostgreSQL tests had in common was that they were run on comparably priced hardware, said Berkus. The results tend to offset PostgreSQL's reputation for poor performance, he claimed.

Sun distributes PostgreSQL with its Solaris 10 operating system. In a bid to show the value of the combination, it sponsored a test using a benchmark from the Standard Performance Evaluation Corp. (SPEC). The specific test was SPECjAppServer2004, which is meant to measure the performance of a Java application server and database combination on a particular piece of hardware.

Oracle used the same SPECjAppServer2004 benchmark last December to boast of the performance of the Oracle 10g database and Oracle Application Server combination on a hardware cluster.

Three different application servers working with three different databases have produced surprisingly similar benchmark results in recent tests.

Measured in JOPS, or Java application server operations per second, the Oracle and Oracle Application Server combination yielded 874 operations per second. The benchmark was conducted by HP since it was run on its Itanium-based Integrity server.

The PostgreSQL 8.2/Java Application Server combo on Sun UltraSparc hardware yielded 778 operations per second.

The Oracle system had the edge in listed hardware value, $74,000, compared with $65,000 for the Sun hardware used for the PostgreSQL system.
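Working from the published figures, the "about 12%" margin and the "comparably priced" claim can both be checked with quick arithmetic (a sketch using only the JOPS and price numbers quoted above):

```python
# Published SPECjAppServer2004 results and listed hardware values from the article.
oracle_jops = 874        # Oracle 10g + Oracle Application Server on HP Itanium
postgres_jops = 778      # PostgreSQL 8.2 + Java application server on Sun UltraSparc
oracle_price = 74_000    # listed hardware value, USD
postgres_price = 65_000

# Oracle's throughput edge, measured relative to the PostgreSQL result
gap = (oracle_jops - postgres_jops) / postgres_jops
print(f"Oracle edge: {gap:.1%}")  # ~12.3%, matching the article's "about 12%"

# Hardware dollars per JOPS, the basis of the price-comparable framing
print(f"Oracle:     ${oracle_price / oracle_jops:.0f} per JOPS")
print(f"PostgreSQL: ${postgres_price / postgres_jops:.0f} per JOPS")
```

The per-JOPS cost of the two systems works out to within about a dollar of each other, which is what makes the comparison meaningful despite the differing hardware.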

Nevertheless, the tests are the most comparable run so far under the SPEC benchmark and show only a 12% performance margin for Oracle, said Berkus. He was involved in formatting the benchmark according to SPEC guidelines over the last six months. He said Sun, Unisys, and NTT contributed expertise to the benchmark.

A third test sponsored by Sun, using BEA WebLogic with IBM DB2, yielded 802 JOPS, running on similar UltraSparc hardware.

SPEC benchmarks must follow set rules and guidelines in how they are conducted and are subject to review before the results can be published. The SPEC organization nevertheless says each vendor is responsible for the accuracy of the benchmark.

Berkus said the publication of the results for PostgreSQL had the potential to "influence corporate software buyers," although few have warmed to the PostgreSQL system in the past.

In a July 9 blog post on the benchmark, Berkus conceded that PostgreSQL has "a reputation for sluggish performance. ... This publication [of benchmark results] shows that a properly tuned PostgreSQL is not only as fast or faster than MySQL, but almost as fast as Oracle."

Berkus said neither MySQL nor PostgreSQL has been subject to many benchmark tests that allow comparison with proprietary systems, because of the expense of conducting them. A true head-to-head comparison of database performance would probably require the TPC benchmark, he added; SPEC benchmarks measure more than the database alone, but for now they are the most direct comparison available.
