Microsoft SQL Server "Project Hekaton" promises in-memory transactional processing that will stack up against SAP Hana and Oracle Exadata.
Microsoft made three announcements related to its SQL Server database business Wednesday, but the headliner was clearly "Hekaton," a preview project that promises to bring in-memory OLTP (transactional) processing to the next major release of the database. With that move, which analysts expect by 2014 or 2015, Microsoft will have its in-memory answer to both SAP Hana and Oracle Database with its Exadata and Exalytics appliance options.
In Microsoft's other two announcements, the company released Service Pack 1 (SP1) for Microsoft SQL Server 2012 and said that the next release of the Microsoft SQL Server Parallel Data Warehouse (PDW) will debut in the first half of 2013. With that, PDW will move from SQL Server 2008 R2 to SQL Server 2012, and it will also gain a new feature called PolyBase, which will support federated querying against the SQL Server 2012 relational database as well as data in Microsoft's just-released Hadoop platform, Microsoft HDInsight Server for Windows.
PolyBase contrasts with the SAP and Oracle approaches to big data in that it provides a single point of federated querying with Hive access to the data in Hadoop. That means you can leave the data in Hadoop in place, whereas Microsoft's competitors provide connectors and suggest that customers take the step of moving MapReduce result sets into their relational database environments.
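To make the contrast concrete, here is a minimal sketch of the federated-query idea in Python. Everything in it is hypothetical (PolyBase's actual interface had not been detailed at the time): the point is simply that a filter is pushed to the Hadoop/Hive side and the join happens at query time, so the raw data never has to be copied into the relational store.

```python
# Illustrative sketch of federated querying: one query surface joins rows
# in a relational store with rows left in place in Hadoop/Hive.
# All names (hive_scan, federated_join, the sample data) are hypothetical --
# this is not PolyBase's real API.

# Simulated relational table: customer_id -> name
SQL_CUSTOMERS = {1: "Acme", 2: "Globex"}

# Simulated clickstream data that stays in Hadoop, reachable via Hive
HIVE_CLICKS = [
    {"customer_id": 1, "url": "/home"},
    {"customer_id": 2, "url": "/pricing"},
    {"customer_id": 1, "url": "/docs"},
]

def hive_scan(predicate):
    """Push a filter down to the Hadoop side; only matching rows move."""
    return [row for row in HIVE_CLICKS if predicate(row)]

def federated_join(predicate):
    """Join Hadoop-resident rows with relational rows at query time."""
    return [
        {"name": SQL_CUSTOMERS[row["customer_id"]], "url": row["url"]}
        for row in hive_scan(predicate)
        if row["customer_id"] in SQL_CUSTOMERS
    ]

result = federated_join(lambda r: r["url"].startswith("/"))
```

The connector-style alternative the article attributes to SAP and Oracle would instead run MapReduce first and bulk-load the result set into the relational database before any join could happen.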
Hekaton is the Greek word for 100, a nod to the 100-times peak performance improvement Microsoft says is the design goal. In early lab testing of Hekaton, Microsoft executives said they're already seeing a 50-times improvement in transactional application performance. When the technology finally reaches the market with "the next major release of SQL Server" (the company offered no target dates for that step), Microsoft said it will fill in the last piece of an in-memory computing strategy that will have competitors beat.
"Our design approach is built into SQL Server," explained Doug Leland, Microsoft's general manager of SQL Server marketing, in an interview with InformationWeek. "SAP and Oracle are basically forcing customers to buy, learn and manage a separate solution, but we're taking a differentiated approach which is building the [in-memory] technology directly into the data platform that customers are already using."
Customers will be able to install Hekaton on the commodity servers they're using today, according to Microsoft, and the software will take advantage of the resources that are available in the box. "The system will look at how much storage and memory you have available and will make choices about what tables it loads up into main memory," Leland said.
SQL Server customers expecting to move large or particularly demanding portions of database workload into memory would clearly have to increase available RAM in order to get true in-memory performance, but the database could always fall back on conventional disk-based retrieval of information if RAM isn't available. In SAP's approach, performance "falls off a cliff" when memory capacity requirements are surpassed, according to Leland, necessitating software and hardware upgrades. And in contrast to Oracle's cache-centric Exadata approach, Hekaton will deliver true in-memory performance.
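The behavior Leland describes can be sketched in a few lines of Python. The selection policy (greedy, hottest tables first) and every name here are assumptions for illustration, not Hekaton's actual algorithm; what matters is the shape of the design: the engine sizes up available RAM, pins what fits, and serves the rest from disk rather than failing when memory runs out.

```python
# Hypothetical sketch of memory-aware table placement with disk fallback.
# The greedy hottest-first policy is an assumption, not Hekaton's algorithm.

def pick_in_memory_tables(tables, available_ram_mb):
    """tables: list of (name, size_mb, accesses_per_sec) tuples.
    Pin the most frequently accessed tables that fit in available RAM."""
    chosen, used = set(), 0
    for name, size_mb, _rate in sorted(tables, key=lambda t: t[2], reverse=True):
        if used + size_mb <= available_ram_mb:
            chosen.add(name)
            used += size_mb
    return chosen

def access_path(table, in_memory_set):
    # No hard failure when RAM is exhausted -- just the slower,
    # conventional disk path (unlike a memory-only design).
    return "memory" if table in in_memory_set else "disk"

tables = [
    ("orders", 800, 5000),      # hot OLTP table
    ("audit_log", 4000, 10),    # large, rarely touched
    ("customers", 300, 2000),   # hot lookup table
]
pinned = pick_in_memory_tables(tables, available_ram_mb=1500)
# orders and customers fit (1,100 MB); audit_log stays on disk
```

This graceful degradation is the article's contrast with a memory-only design, where exceeding capacity forces a hardware or software upgrade rather than a slower code path.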
So how will things stack up in a three-way transactional processing competition between SAP Hana, Microsoft SQL Server with Hekaton and Oracle Exadata? "The Flash used in Exadata is not memory, but it is fast," Gartner analyst Don Feinberg said in an interview with InformationWeek. "You give up the potential for real-time analytics and it won't be as fast as Hana or Hekaton, but for most clients of Exadata, it's going to be fast enough."
Feinberg expects that Oracle, too, will bring in-memory capabilities to Oracle database -- something currently limited to the TimesTen database in the Exalytics appliance -- but there's no telling when that might happen. SAP is promising core application transaction processing on Hana sometime next year. If Hekaton has to wait for the next "201X" release of SQL Server, it would likely not ship until late 2014 at the earliest, though an R2-designated update to SQL Server 2012 might bring it to market sooner.
Microsoft has already delivered in-memory business intelligence and analytics with Microsoft PowerPivot and Power View. PowerPivot was introduced as an in-memory-analysis plug-in to Microsoft Excel with the release of SQL Server 2008 R2 in 2010. Power View is an in-memory data visualization option introduced in April with the release of SQL Server 2012.
With the release of SQL Server 2012 SP1, the database will support the upcoming release of Office 2013, which builds PowerPivot and Power View directly into Microsoft Excel as data-exploration and visualization options available from the ribbon menu. That means customers will no longer have to download plug-ins to Excel in order for users to tap into data on enterprise SQL Server databases.
Microsoft says its PowerPivot, Power View and xVelocity in-memory column-store and multidimensional-analysis capabilities built into SQL Server 2012 are already accessible to 1.5 million customers. With Office 2013, now available in beta and expected early next year, in-memory analysis will be accessible to all users of the next generation of Excel.