Open Source PostgreSQL Trails Oracle In Benchmark, But Not By Much - InformationWeek


News | 7/17/2007 11:20 PM

The test showed open source PostgreSQL results that were about 12% lower than an Oracle database system, which one backer called "a good day for open source."

Sun Microsystems has published a test of the PostgreSQL system running on its UltraSparc Niagara hardware, producing one of the few high-end server benchmarks available on the open source PostgreSQL database.

The test showed PostgreSQL results that were about 12% lower than an Oracle database system running on a comparably priced HP Itanium system, an outcome that Josh Berkus, a core contributor to the PostgreSQL open source project, called "a good day for open source."

PostgreSQL has had a reputation as a feature-rich but compute-hungry system that is slower than Oracle and its open source competitor, MySQL.

The benchmark is not conclusive. Different hardware running different numbers of core processors was used in tests. The main thing the Oracle and PostgreSQL tests had in common was that they were run on comparably priced hardware, said Berkus. The results tend to offset PostgreSQL's reputation for poor performance, he claimed.

Sun distributes PostgreSQL with its Solaris 10 operating system. In a bid to show the value of the combination, it sponsored a test using a Standard Performance Evaluation Corporation (SPEC) benchmark. The specific test was SPECjAppServer2004, which measures the performance of a Java application server and database combination on a particular piece of hardware.

Oracle used the same SPECjAppServer2004 benchmark last December to tout the performance of the Oracle 10g database and Oracle Application Server combination on a hardware cluster.

In recent tests, three different application servers working with three different databases have produced surprisingly similar benchmark results.

Measured in JOPS, or Java application server operations per second, the Oracle database and Oracle Application Server combination yielded 874 operations per second. That benchmark was conducted by HP, since it ran on HP's Itanium-based Integrity server.

The PostgreSQL 8.2/Java Application Server combo on Sun UltraSparc hardware yielded 778 operations per second.

The Oracle system ran on more expensive hardware, with a list price of $74,000, compared with $65,000 for the Sun hardware used in the PostgreSQL test.

Nevertheless, the tests are the most closely comparable runs so far under the SPEC benchmark, and they show only a 12% performance margin for Oracle, said Berkus. He spent the last six months helping format the benchmark according to SPEC guidelines, and said Sun, Unisys, and NTT contributed expertise.

A third test sponsored by Sun, using BEA WebLogic with IBM DB2, yielded 802 JOPS, running on similar UltraSparc hardware.
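The quoted margin can be checked directly from the three JOPS figures above. A minimal sketch (the system labels are informal shorthand for the configurations described in the article, not official SPEC result names):

```python
# Reported SPECjAppServer2004 results, in JOPS, as quoted in the article.
results = {
    "Oracle 10g / Oracle App Server (HP Itanium)": 874,
    "IBM DB2 / BEA WebLogic (Sun UltraSPARC)": 802,
    "PostgreSQL 8.2 / Java App Server (Sun UltraSPARC)": 778,
}

oracle_jops = results["Oracle 10g / Oracle App Server (HP Itanium)"]

for name, jops in results.items():
    # Oracle's lead expressed relative to each system's own throughput.
    lead_pct = (oracle_jops - jops) / jops * 100
    print(f"{name}: {jops} JOPS (Oracle lead: {lead_pct:.1f}%)")
```

Run this way, Oracle's lead over the PostgreSQL configuration works out to roughly 12% (96 JOPS over a 778 JOPS baseline), matching the "about 12%" figure Berkus cites, and its lead over the DB2 configuration to roughly 9%.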

SPEC benchmarks must follow set rules and guidelines in how they are conducted and are subject to review before the results can be published. The SPEC organization nevertheless says each vendor is responsible for the accuracy of the benchmark.

Berkus said the publication of the results for PostgreSQL had the potential to "influence corporate software buyers," although few have warmed to the PostgreSQL system in the past.

In a July 9 blog post on the benchmark, Berkus conceded that PostgreSQL has "a reputation for sluggish performance. ... This publication [of benchmark results] shows that a properly tuned PostgreSQL is not only as fast or faster than MySQL, but almost as fast as Oracle."

Berkus said neither MySQL nor PostgreSQL has been subject to many benchmark tests that allow comparison with proprietary systems, because of the expense of conducting the benchmarks. A true head-to-head comparison of database performance will probably require the TPC benchmark, he added. SPEC benchmarks measure more than the database, but for now they are the most direct comparison available.
