SEATTLE, October 6, 2008 — Microsoft will bow in-memory data analysis capabilities and support for large-scale data warehouse deployments (on third-party appliance hardware) by 2010. That's the big news here at Microsoft's second annual Business Intelligence Conference, which got underway this morning with more than 2,500 attendees. It's big news on both counts, but nothing shocking.
Microsoft had preannounced that it would lay out its roadmap for the recently acquired DATAllegro data warehousing technology at this event. And with in-memory technology already available from multiple competitors, it's no surprise to hear that SQL Server Analysis Services will gain what Microsoft calls "managed self-service" analytics and reporting.
Microsoft used code names to describe what will bow as a community technology preview (beta release) in 2009, with general availability expected in "the first half of 2010." Most of the news is part of "Kilimanjaro," which is the code name for a coming BI-focused upgrade of Microsoft SQL Server 2008. The core news here is "Project Gemini," which fits in with Microsoft's "People-Ready-BI" theme.
"Project Gemini will allow us to greatly expand the number of users who are gaining insights out of data, while doing it in a way that IT can still manage and govern," said Tom Casey, General Manager, SQL Server Business Intelligence, in a pre-conference briefing with Intelligent Enterprise. "It's essentially another storage mode for Microsoft SQL Server Analysis Services with access via MDX, so existing applications will be able to take advantage of the performance enhancements."
Tapping into in-memory capabilities built into Analysis Services with the aid of an add-in to the Excel client, users will reportedly be able to slice, dice and filter vast data sets (into the millions of rows) without aggregations or prebuilt cubes developed by IT.
During today's presentations, Microsoft's Donald Farmer demonstrated a live analysis of 20 million rows of data in Excel with fast-moving sorting, filtering and slice capabilities as well as on-the-fly mashups of corporate data with external data sources.
Users will store and share the models, analyses and reports they develop on SharePoint, which will add an element of centralized control.
"When people share their documents via SharePoint, Analysis Services ensures the management and storage of the data and the models implicitly created," Casey said. "That eliminates the problem of having a bunch of spreadsheets running wild because it ensures that users are interacting with common, centralized data."
With analyses living in SharePoint, Casey said IT can administer access rights. Power users and subject matter experts will also be able to spot effective new analyses and approaches developed by users that might be replicated elsewhere. He added that users will be able to analyze millions of rows even on sub-$1,000 desktop machines, though performance will improve with upgrades to 64-bit, multiprocessor hardware.
On the data warehousing front, "Project Madison" is the code name for what Microsoft is doing with DATAllegro. "This will enable us to scale out for data warehouse deployments with many hundreds of terabytes," Casey said. "What we'll show this week is that we have already replaced DATAllegro's underlying storage and query processing layer to run on SQL Server, so we've made very good progress."
Microsoft won't be saying anything about appliance packaging, pricing or distribution this week, but Casey said the hardware will come from "the usual, industry-standard hardware partners, including HP, Unisys, Dell and Bull."
With projects Gemini and Madison in 2010, Microsoft will be late to the party with both in-memory and appliance-based data warehousing. IBM Cognos gained in-memory support last year with its acquisition of Applix. SAP Business Objects has made it clear it will exploit the in-memory capabilities built into the SAP BI Accelerator. And independents including QlikTech and Spotfire have been highlighting in-memory technology for years. Oracle, too, acquired in-memory technology several years ago, with the purchase of TimesTen. But Microsoft's announcement will put pressure on Oracle to productize on-the-fly analysis in a more prominent way.
Microsoft's time line for Project Madison ends industry speculation on how long it will take the company to embed DATAllegro into SQL Server (the answer being 18 to 24 months, depending on the exact date of Madison's release). In the interim, the dozen-plus independent data warehouse appliance vendors will presumably continue to grow. And just last month, Oracle announced immediate availability of its HP-Oracle Database Machine and HP-Oracle Exadata Storage Server.
Even if Microsoft is late to large-scale data warehousing, its current customer base, which is dominated by small and midsize businesses, isn't quite ready to join those ranks anyway. Thus, Microsoft may be there in plenty of time to grab a big chunk of future mainstream adoption.
"With the auto query tuning, auto load balancing and Resource Governor capabilities in SQL Server 2008, Microsoft can already address 75 percent of data warehousing needs," commented Forrester analyst Jim Kobielus. "Once they get up into the hundreds of terabytes with the DATAllegro technology, they'll address more than 90 percent of customer needs."