Here are three surefire facts you need to consider about in-memory databases, along with one hard question those facts should lead you to ask.
First, databases that take advantage of in-memory processing really do deliver the fastest data-retrieval speeds available today, which is enticing to companies struggling with high-scale online transactions or timely forecasting and planning.
Second, though disk-based storage is still the enterprise standard, the price of RAM has been declining steadily, so memory-intensive architectures will eventually replace slow, mechanical spinning disks.
Third, a vendor war is breaking out to convert companies to these in-memory systems. That war pits SAP -- application giant but database newbie -- against database incumbents IBM, Microsoft, and Oracle, which have big in-memory plans of their own.
Which leads to the question for would-be customers: So what if you can run a query or process transactions 10 times, 20 times, even 100 times faster than before? What will that difference do for your business? The answer will determine whether in-memory databases are a specialty tool for only a few unique use cases, or a platform upon which all of your enterprise IT runs.
Pioneer companies are finding compelling use cases. In-memory capabilities have let the online gaming company Bwin.party go from supporting 12,000 bets per second to supporting up to 150,000. That's money in the bank. For the retail services company Edgenet, in-memory technology has brought near-real-time insight into product availability for customers of AutoZone, Home Depot, and Lowe's. That translates into fewer wasted trips and higher customer satisfaction.
SAP's Hana in-memory database lets ConAgra test new approaches for material forecasting, planning, and pricing. "Accelerating something that exists isn't very interesting," Mindy Simon, ConAgra's vice president of IT, said at a recent SAP event. "We're trying to transform what we're doing with integrated forecasting."
ConAgra, an $18 billion-a-year consumer packaged goods company, must quickly respond to the fluctuating costs of 4,000 raw materials that go into more than 20,000 products, from Swiss Miss cocoa to Chef Boyardee pasta. What's more, if it could make its promotions timelier by using faster analysis, ConAgra and its retailer customers could command higher prices in a business known for razor-thin profit margins.
"If there's a big storm hitting the Northeast this weekend, for example, we want to make sure that Swiss Miss cocoa is available in that end-cap display," Simon said. But a company can't do that quickly if it has to wait for overnight data loads and hours-long planning runs to get a clear picture of in-store inventories, available product, and the profitability tied to various pricing scenarios.
With its Hana platform, SAP has been an early adopter and outspoken champion of in-memory technology, but it certainly doesn't have a monopoly on using RAM for processing transactions or analyzing data. Database incumbents IBM, Microsoft, and Oracle have introduced or announced their own in-memory capabilities and plans, as have other established vendors and startups. The key difference is that the big three incumbents are looking to defend legacy deployments and preserve license revenue by adding in-memory features to conventional databases. SAP is promising "radical simplification," with an entirely in-memory database that it says will eliminate layers of data management infrastructure and cut the overall tab for databases and information management.
As the in-memory war of words breaks out in 2014, the market is sure to get confusing. To cut through the hyperbole, here's a close look at what leading vendors are promising and, more importantly, what early adopters are saying about the business benefits.
SAP's four promises

If you're one of the 230,000 customers of SAP, you've been hearing about in-memory technology for more than three years, as it's at the heart of Hana. SAP laid out a grand plan for Hana in 2010 and has been trumpeting every advancement as a milestone ever since. Hana is now much more than a database management system, packing analytics, an application server, and other data management components into what the vendor calls the SAP Hana Platform. At the core of SAP's plans are four big claims about what Hana will do for customers.
First, SAP says Hana will deliver dramatically faster performance for applications. Second, SAP insists Hana can deliver this performance "without disruption" -- meaning customers don't have to rip out and replace their applications. Third, SAP says Hana can run transactional and analytical applications simultaneously, letting companies eliminate "redundant" infrastructure and "unnecessary" aggregates, materialized views, and other copies of data. Fourth, SAP says Hana will provide the foundation for new applications that weren't possible on conventional database technologies. ConAgra's idea for real-time pricing-and-profitability analysis is a case in point.
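To make the third claim concrete: the contrast is between maintaining a separate, pre-computed aggregate (the traditional data-warehouse pattern) and answering the same question on demand from the single transactional copy. Here's a minimal Python sketch of the two patterns; the table, column names, and figures are hypothetical illustrations, not SAP's implementation.

```python
# Hypothetical order rows: the single transactional copy of the data.
orders = [
    {"product": "cocoa", "region": "NE", "amount": 120.0},
    {"product": "pasta", "region": "NE", "amount": 80.0},
    {"product": "cocoa", "region": "SW", "amount": 45.0},
]

# Traditional pattern: a redundant, pre-computed aggregate that must be
# refreshed (or incrementally maintained) every time orders change.
sales_by_product = {}
for row in orders:
    sales_by_product[row["product"]] = (
        sales_by_product.get(row["product"], 0.0) + row["amount"]
    )

# In-memory pattern: the same analytical answer computed on demand from
# the transactional rows themselves, with no second copy to keep in sync.
def sales_for(product):
    return sum(r["amount"] for r in orders if r["product"] == product)

assert sales_for("cocoa") == sales_by_product["cocoa"] == 165.0
```

The pre-computed aggregate is only worth its maintenance cost when scanning the base rows is slow; if the base rows sit in RAM, the on-demand query is fast enough that the redundant copy can be dropped, which is the simplification SAP is promising.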
(Source: InformationWeek 2014 State of Database Technology Survey of 955 business technology professionals)
The first and biggest challenge for SAP is persuading customers to use Hana in place of the proven databases they've been using for years. Roughly half of SAP customers run their enterprise-grade applications on Oracle Database. Most other SAP customers use IBM DB2 and Microsoft SQL Server, in that order. SAP says it has more than 3,000 Hana customers, but that's just over 1% of its total customer base. IBM, Microsoft, and Oracle combined have upward of 1 million database customers. The dominance of the big three is confirmed by our recent InformationWeek 2014 State of Database Technology survey (see chart).
The value of faster performance

Nobody questions that in-memory performance beats disk-based performance. Estimates vary depending on disk speed and available input/output (I/O) bandwidth, but one expert puts RAM latency at 83 nanoseconds and disk latency at 13 milliseconds. With 1 million nanoseconds in every millisecond, that makes a RAM access roughly 150,000 times faster than a disk read.
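The arithmetic behind that comparison is simple enough to check directly. This sketch uses the 83-nanosecond and 13-millisecond figures quoted above; real hardware varies widely, so treat the ratio as an order-of-magnitude estimate.

```python
# Back-of-envelope comparison of the latency figures cited in the text.
RAM_LATENCY_NS = 83                 # ~83 nanoseconds per RAM access
DISK_LATENCY_NS = 13 * 1_000_000    # 13 milliseconds = 13,000,000 ns

ratio = DISK_LATENCY_NS / RAM_LATENCY_NS
print(f"RAM is roughly {ratio:,.0f}x faster than disk")  # ~156,627x
```

Even if a database spends only a fraction of its time waiting on storage, a five-orders-of-magnitude gap in access latency explains why in-memory vendors can credibly claim 10x to 100x application-level speedups.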
Doug Henschen is Executive Editor of InformationWeek, where he covers the intersection of enterprise applications with information management, business intelligence, big data, and analytics.