Microsoft SQL Server 2012: Big Data Power Incomplete
Microsoft's years-in-the-making database upgrade finally arrives, but related Hadoop components and services are still in preview mode.
Microsoft announced Tuesday that it has released SQL Server 2012 to manufacturing with a splashy headline about managing "any data, any size, anywhere." But in a pattern that has become familiar for Microsoft, you have to add an "any time now" caveat to the SQL Server 2012 big data story.
As the successor to Microsoft SQL Server 2008 R2, the 2012 update is a big deal. It includes an in-memory data-visualization component called Power View that Microsoft has been developing and talking about for years under the code name Crescent. The release brings more than 200 performance, scalability, and reliability improvements to the database.
A new SQL Server AlwaysOn high-availability and disaster-recovery system is said to reduce both planned and unplanned downtime. A new ColumnStore Indexing feature is claimed to boost performance by up to 10 times on star joins and analytic queries that touch only selected columns of data. A BI (Business Intelligence) Semantic Model helps users tie together reporting, analytics, scorecards, and dashboards. New security and encryption features let administrators separate duties, expose services and features to the right people, protect data, and ensure compliance.
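For a sense of how lightweight the columnstore feature is to adopt, it amounts to a single DDL statement over the columns an analytic query is likely to scan. Here's a minimal sketch, assuming a hypothetical star-schema fact table dbo.SalesFact (the table, column, and index names are illustrative, not drawn from Microsoft's documentation):

```sql
-- Hypothetical fact table and index name, for illustration only.
-- A nonclustered columnstore index covering the columns analytic
-- queries scan; SQL Server stores and compresses each column
-- separately, which is where the claimed speedup comes from.
CREATE NONCLUSTERED COLUMNSTORE INDEX ix_SalesFact_columnstore
ON dbo.SalesFact (OrderDateKey, ProductKey, StoreKey, SalesAmount);
```

One caveat worth knowing: in the 2012 release, a table is read-only while a nonclustered columnstore index exists on it, so in practice the index is dropped or disabled before data loads and rebuilt afterward.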
To support hybrid IT spanning on-premises, private cloud, and public cloud deployments, SQL Server 2012 provides a common architecture and portable licensing approach, so you can choose among, and shift between, conventional servers, appliances, and SQL Azure on the Microsoft Azure public cloud. A SQL Azure Data Sync service provides bidirectional synchronization between databases in the data center and those running in the cloud.
All of these aspects of the release and more can be tested immediately through a downloadable evaluation version of the database. With code released to manufacturing on Tuesday, SQL Server 2012 will become generally available on April 1. And thanks to Microsoft's methodical community technology preview release approach, you can be assured that more than 150,000 customers have already kicked the tires on SQL Server 2012 and provided feedback that has been rolled into improvements to the final product.
"SQL Server 2012 is very significant, as befits a major release, [and] there's no reason to believe Microsoft's steady growth in database market share will be anything but helped," Gartner analyst Merv Adrian told InformationWeek.
As for the headline-and-hyperbole-highlighted big data aspects of the release, the only components that are production-ready today are bidirectional connectors to the Apache Hadoop framework. Those were announced in October and released late last year along with a preview Hadoop service on Microsoft SQL Azure.
It's a decent start in that Microsoft customers can at least move data from SQL Server into Hadoop and bring boiled-down MapReduce processing result sets from Hadoop back into SQL Server. But meanwhile, all of Microsoft's major big-data rivals, including EMC, IBM, and Oracle, are already shipping complete Hadoop software distributions (the last in partnership with Cloudera).
Microsoft's answer, a Hadoop distribution for Windows developed with partner Hortonworks, is still waiting for its code contributions to clear the Apache open-source approval process. How long will that take? It's only "a matter of weeks," according to Hortonworks, but once that's done, Microsoft will still have to run the release through its usual community technology preview process. Microsoft is sticking to the release estimate it offered last October, which was calendar 2012 (so think by the end of the year).
Microsoft also announced on Tuesday a second community technology preview of its Hadoop service on SQL Azure. This second release adds support for predictive analytics software and new features for resiliency and high-availability for the HDFS (Hadoop Distributed File System) name node. It also expands the preview from 400 customers to more than 2,000. Meanwhile, plenty of other options for Hadoop processing--notably from Amazon, IBM, Cloudera, and others--have been available for many months, if not years.
Will Microsoft still be relevant to the big data world once everything falls into place (hopefully) later this year? "None of [the competing Hadoop distributions] run in Windows, but plenty of IT shops do and have hoarded data there that they will want to use," commented Adrian. "So Microsoft will absolutely be relevant and will have the inside track to the Windows community."
Much the same could be said of the Microsoft SQL Server 2008 R2 Parallel Data Warehouse (PDW) release, which remains the only Windows-friendly massively parallel processing platform. Yet more than a year after PDW's release, little is seen or heard of the product. The company seems to be trying to kick-start PDW with a half-rack appliance configuration introduced alongside the SQL Server 2012 release. But for now, Microsoft is about as relevant to high-scale data warehousing as it is to the smartphone and tablet markets.
There was a surprisingly strong hue and cry when Microsoft scuttled Dryad in favor of Hadoop, so perhaps things will be different for the last-to-market Hadoop-on-Windows release. We'll have to wait and see.