Oracle Database In-Memory comes with big performance promises. But look closely and you'll find performance tradeoffs, application-change requirements, and acceleration constraints.
Oracle CEO Larry Ellison on Tuesday promised performance gains without compromise from Oracle Database In-Memory, an option set for general release in July. But in a familiar pattern for in-memory technology, the claims come with a few important caveats.
Oracle's key claims about the in-memory option are that it will deliver dramatic analytical performance improvements and unexpected transactional speed gains, all without requiring changes to existing applications. Each of these goals may be achievable independently, but as an Oracle executive acknowledged in a follow-up interview after Tuesday's presentation, performance improvements will vary, and in some cases they may require changes to applications.
Here are six crucial facts about the option, along with explanations from Oracle and comments from two in-memory competitors.
1. BI and analytical apps may not work properly if you get rid of analytical indexes. In Tuesday's announcements about Oracle Database In-Memory, Ellison promised "at least 100 times" faster analytical performance thanks to the in-memory columnar data store introduced by this option. Transactional performance is only two to four times faster, according to Oracle, because the row store -- the heart of the database that runs transactional applications -- is unchanged. That data remains on spinning disks.
In advance of the announcement, in-memory competitors SAP and VoltDB questioned whether transactional performance can improve at all, given that the database will now have to sync data between the row store and the new columnar store. But Ellison said Oracle's trick for achieving both goals is eliminating the analytical indexes that are no longer required once customer-selected tables and partitions are placed in memory. The question is, how can existing BI and analytical applications designed to work with those indexes run without changes?
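For context on what "placing tables and partitions in memory" looks like, here is a brief sketch using the INMEMORY clause Oracle documents for 12c. The sizes, table, and partition names are hypothetical, and deployment details (restart requirements, priority levels) vary by environment:

```sql
-- Allocate space for the in-memory column store (illustrative size;
-- requires a restart to take effect)
ALTER SYSTEM SET INMEMORY_SIZE = 16G SCOPE=SPFILE;

-- Mark a hypothetical table for population into the columnar store
ALTER TABLE sales INMEMORY PRIORITY HIGH;

-- Or populate only a single partition of that table
ALTER TABLE sales MODIFY PARTITION sales_q1_2014 INMEMORY;
```

Note that only the objects a customer explicitly selects are duplicated into the column store; the row store on disk remains the system of record.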
"The optimizer on 12c has been enhanced to be aware of the in-memory store, and if it works correctly, those BI queries will be redirected," explained Tim Shetler, Oracle's group VP, product management. The important caveat? "It's pretty easy to forget all the applications that depend on indexes, so we're giving cautious guidance to customers to test before eliminating indexes."
Customers can test with a database setting that can temporarily hide indexes without deleting them. "Pick an index, run for a week, and if important applications run slowly, maybe that index needs to stay or you need to do something differently," he said.
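The setting Shetler describes is consistent with Oracle's invisible-index feature, available since 11g. A minimal sketch, with a hypothetical index name:

```sql
-- Hide a hypothetical index from the optimizer without dropping it
ALTER INDEX sales_region_idx INVISIBLE;

-- A test session can still opt in to invisible indexes for comparison
ALTER SESSION SET OPTIMIZER_USE_INVISIBLE_INDEXES = TRUE;

-- If important workloads regress, restore visibility -- no rebuild needed
ALTER INDEX sales_region_idx VISIBLE;
```

Because the index is hidden rather than dropped, it is still maintained on writes during the trial, so this measures query impact only, not the write-side savings of actually removing it.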
Transactional performance can only improve if you eliminate analytical indexes. But the more of these indexes you remove, the more likely it is that you'll need to rewrite BI and analytical apps to run on the in-memory option.
2. Transactional performance might pale in comparison with in-memory rivals. In contrast to Oracle, Microsoft is focusing strictly on transactional performance with its Microsoft SQL Server In-Memory OLTP option. Microsoft cites low-latency transactional demand from telcos, gaming companies, financial services, and online retailers. Gaming company Bwin.party, for example, reported 10 times to 15 times faster throughput using Microsoft's option for applications that were formerly disk-I/O-bound.
According to SAP, another proponent of in-memory transactional acceleration, Oracle "has not addressed the I/O issues associated with data on disk," said Irfan Khan, senior VP and general manager of SAP Database & Technology, in an email interview with InformationWeek. "While it may remove some maintenance of certain analytic indexes, it adds new overhead keeping two versions of data in sync across the row store and columnar cache."
If transactional acceleration is needed, Oracle does offer the Oracle TimesTen in-memory database, Shetler pointed out. "But transactional applications represent a relatively small market opportunity," he asserted. "It's a little bit confusing why Microsoft has focused on transactions as their key point because there's so much [going on in] analytics now."
That's certainly true, but columnar databases geared to analytics "have been in the market for several years now," said Ryan Betts, CTO of VoltDB, in an email interview. "In-memory is not only for analytics as Oracle would have you believe." VoltDB's transaction-oriented in-memory database is used in networking, gaming, financial services, and online marketing and retailing applications.
3. Dramatic acceleration will require a little work. The good news is that all applications that currently run on Oracle will run on, and see performance improvement with, Oracle Database In-Memory. As for all those dramatic performance gains cited during Tuesday's presentation -- queries going from nearly 4 hours down to 4 seconds, or 58 hours down to 13 minutes? That might take a little work, but it's work that will be done primarily by software vendors.
"Those speedups on Oracle applications weren't achieved just by deploying the in-memory option," clarified Shetler of Oracle, noting that Oracle's apps teams were part of the beta program and learned how to exploit the option. "When they modified the apps to take advantage of the in-memory technology, that's when they got the levels of performance improvement shared in the presentation."
In more good news, many applications, including some Oracle applications, have yet to be certified on 12c, so adaptations to the in-memory option can and will be carried out by software vendors. Customers will see what Shetler described as "modest" performance improvements on current applications without any changes. But with upgrades to 12c-certified apps over the next couple of years, customers will be able to take full advantage.
Doug Henschen is Executive Editor of InformationWeek, where he covers the intersection of enterprise applications with information management, business intelligence, big data and analytics.