March 31, 2015
Big Data Certifications: Finding The One That Works For You
Last week it was Couchbase. On Tuesday, March 31, MongoDB pointed to third-party research that shows its product delivers superior performance to that of its rivals. Whose research can you believe?
As we reported last week, Couchbase started this database-performance claim war by offering research conducted by Avalon Consulting LLC -- clearly sponsored by Couchbase -- that shows the Couchbase NoSQL database management system beating MongoDB on multiple performance measures.
The key point of Avalon's whitepaper was that the matchup was against MongoDB 3.0, that vendor's latest release, which features the recently acquired WiredTiger storage engine. The new engine substantially improves write performance and scalability, according to MongoDB, yet by Avalon's measures, Couchbase had higher throughput and concurrency in every test.
[Want more on this database performance flap? Read Couchbase Claims Performance Gains Against NoSQL Rivals.]
MongoDB naturally begged to differ with Avalon's findings, noting that the Couchbase configuration used in the test harnessed three times more hardware than the MongoDB configuration, while the latter deployment had an automatic-failover feature turned off, contrary to MongoDB best practices. "If MongoDB were configured comparably to Couchbase in these tests, the results would be dramatically different," stated Kelly Stirman, MongoDB's director of products, in a comment on that story.
The research sponsored and released by MongoDB on Tuesday was carried out by United Software Associates, and it compared MongoDB to Cassandra and Couchbase. According to the report, the test used identical hardware for all three products and ran the Yahoo! Cloud Serving Benchmark (YCSB) to measure insert, update, and read performance.
Predictably, MongoDB won on every measure in United Software Associates' tests, across different workloads and measures of database throughput, durability, and balanced combinations of both. The key twist is that every test used a single database server and a single client server, a configuration that hardly stresses scalability or the highly distributed nature of typical NoSQL database deployments -- or at least those of Cassandra and Couchbase. For MongoDB, single-server deployments are common, according to Stirman.
"Databases are often deployed on a single server, and we know that based on profiles of about 60,000 MongoDB deployments that we have access to via our cloud-management tool," said Stirman in a phone interview with InformationWeek. "When run in a distributed fashion, all of these systems are comprised of multiple, individual servers, so you have to start by looking at what a single server delivers [in terms of performance]."
Contrary to this suggestion, scaled-out performance -- much less scaled-out performance across multiple data centers -- is rarely a clear multiple of single-server performance. In fact, Stirman acknowledged that "it's harder to do an apples-to-apples comparison that way because these products scale out in very different ways."
While Avalon's research featured multi-server configurations and tested concurrency in excess of 500 simultaneous users, United's research tested a single database server and a single client server, with no mention of concurrency demands. On the other hand, Avalon's tests used very different hardware configurations for the two products tested, and MongoDB contends its deployment best practices were ignored.
MongoDB's sponsored research was still covered by nondisclosure agreements at this writing, so we'll leave it to Cassandra promoter DataStax and to Couchbase to share their take on the tests in the comments area below. Suffice it to say that the most reliable tests of database performance are independently verified tests, such as TPC benchmarks. Sponsored research invariably delivers exactly what the sponsors pay for: a winning result.
Even better than an abstract benchmark such as a TPC test is a proof-of-concept test using your own data and your own anticipated workloads. Only this type of real-world testing will tell you how products will perform in your environment. In the bargain, your people will also gain experience with the features, security, manageability, and ease of development of the products. On this point we're in agreement with Stirman of MongoDB.
"There's a long list of things you should look at, and performance is part of that consideration," he said.
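To see what a single-node, read-heavy YCSB-style mix actually looks like, here's a minimal sketch in Python. It uses a plain dict as a stand-in for a real database client, so every name, ratio, and number below is illustrative -- none of it is drawn from either vendor's tests.

```python
import random
import string
import time

def make_record(val_len=100):
    """Generate a random string roughly the size of a small record value."""
    return "".join(random.choices(string.ascii_letters, k=val_len))

def run_mixed_workload(store, n_ops=10_000, read_fraction=0.95, key_space=1_000):
    """Run a YCSB-style read-heavy mix (reads vs. updates) against a
    dict-like store and report throughput in operations per second."""
    # Preload the key space so every read finds a record.
    for i in range(key_space):
        store[f"user{i}"] = make_record()

    reads = updates = 0
    start = time.perf_counter()
    for _ in range(n_ops):
        key = f"user{random.randrange(key_space)}"
        if random.random() < read_fraction:
            _ = store[key]               # read
            reads += 1
        else:
            store[key] = make_record()   # update
            updates += 1
    elapsed = time.perf_counter() - start
    return {"ops_per_sec": n_ops / elapsed, "reads": reads, "updates": updates}

result = run_mixed_workload({})
print(f"{result['ops_per_sec']:.0f} ops/sec "
      f"({result['reads']} reads, {result['updates']} updates)")
```

Swapping the dict for an actual driver (a MongoDB, Couchbase, or Cassandra client) turns this into the kind of proof-of-concept test described above -- and, as the sponsored studies show, the hardware, topology, and configuration behind that client matter as much as the loop itself.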