10/16/2013
Microsoft SQL Server 2014: Final Countdown

Microsoft steps closer to delivering in-memory performance with a second beta release of Microsoft SQL Server 2014. Here's a peek at what's coming.

Microsoft's in-memory database project formerly known as Hekaton took a step closer to general release on Wednesday as the company introduced a second community technology preview (CTP) of Microsoft SQL Server 2014.

Hekaton is now known simply as In-Memory Online Transaction Processing (OLTP). As a built-in feature of the database, In-Memory OLTP will let administrators move selected tables into main memory, thereby delivering dramatic performance improvements without having to touch the applications running on the database. Those applications could be custom built or packaged software from Microsoft itself or from third-party vendors including Oracle, SAP and thousands of others.
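For a sense of what that looks like in practice, here is a minimal sketch of declaring a memory-optimized table in the CTP builds, based on Microsoft's published preview documentation. The database name, table, file path and bucket count are illustrative assumptions, not recommendations:

    -- Illustrative names and sizes only. The database first needs a
    -- filegroup that can hold memory-optimized data.
    ALTER DATABASE SalesDemo
        ADD FILEGROUP SalesDemo_imoltp CONTAINS MEMORY_OPTIMIZED_DATA;
    ALTER DATABASE SalesDemo
        ADD FILE (NAME = 'SalesDemo_imoltp_dir',
                  FILENAME = 'C:\Data\SalesDemo_imoltp_dir')
        TO FILEGROUP SalesDemo_imoltp;
    GO

    -- The table itself is declared memory-optimized; applications keep
    -- issuing ordinary T-SQL against it.
    CREATE TABLE dbo.ShoppingCart
    (
        CartId     INT       NOT NULL
            PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
        UserId     INT       NOT NULL,
        CreatedUtc DATETIME2 NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

The SCHEMA_AND_DATA durability setting keeps the table's contents fully logged and recoverable, which is what lets existing applications treat it like any other table.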

SAP was an early proponent of in-memory technology with Hana, now an extensive data-management platform that spans database, application server and other functionality. Oracle recently jumped on the in-memory bandwagon by announcing an In-Memory Option for the Oracle 12c database, but it didn't announce preview or release dates, other than to say it would arrive "in calendar year 2014." It's unknown whether that will be a beta or a general release.

Microsoft introduced SQL Server 2014 and its In-Memory OLTP option in its usual methodical way, starting with the announcement last November and CTP 1 in June. That first preview release has seen more than 36,000 downloads, according to Microsoft, and at least 100 customers have worked closely with Microsoft on tweaks and new features introduced in CTP 2.

[ Want more on Oracle's promised in-memory database option? Read Oracle OpenWorld: 5 Rants And Raves. ]

The list of CTP 1 early adopters included SQL Server customers TPP and Ferranti Computer Systems. TPP is a clinical software provider in England that handles 30 million patient records and as many as 72,000 concurrent users. TPP reported a seven-times improvement in overall system performance with selective use of In-Memory OLTP as compared to its previous deployment on SQL Server 2008 R2. The key difference was faster access to patient records, which will give doctors and nurses more time with patients.

Ferranti implemented CTP 1 and in-memory optimization in hopes its software, used by energy and utility companies, could keep up with exponentially faster and larger smart-meter workloads. Where mechanical utility meters were read once per month, smart meters take readings every 15 minutes. As a result, Ferranti's software will have to go from handling 5 million measurements per year to processing 500 million measurements per day. The software wasn't keeping up with smart-meter workloads running on SQL Server 2012, but Ferranti reports that In-Memory OLTP made it possible to process and store the data as quickly as it's sent by the smart meters.

The short list of improvements in CTP 2 includes broader SQL compatibility and support for additional index types. New tools have been introduced to help administrators assess database performance and data-access patterns so they can determine which tables would most benefit from in-memory optimization. The in-memory feature has also been integrated with the SQL Server AlwaysOn high-availability feature, a sign that users can count on continuous operation.
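Microsoft's new tooling surfaces those recommendations through Management Studio reports, but a rough first pass can be done by hand. The query below is one hedged way to spot contention-heavy tables using a long-standing dynamic management view; it is not the CTP 2 tooling itself, and useful thresholds will vary by workload:

    -- Tables with heavy page-latch and row-lock waits are often the
    -- strongest candidates for In-Memory OLTP.
    SELECT
        OBJECT_NAME(s.object_id)     AS table_name,
        SUM(s.page_latch_wait_count) AS page_latch_waits,
        SUM(s.row_lock_wait_in_ms)   AS row_lock_wait_ms
    FROM sys.dm_db_index_operational_stats(DB_ID(), NULL, NULL, NULL) AS s
    WHERE OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1
    GROUP BY s.object_id
    ORDER BY page_latch_waits DESC;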

With CTP 2, Microsoft said it expects to see tens of thousands more downloads of the software and further refinements to what will become the final, general release of SQL Server 2014. The exact date of that release remains vague: sometime in the first half of 2014.

But Microsoft's coming release is far more tangible than Oracle's promised in-memory option. There have been tens of thousands of downloads of CTP 1 and lots of detailed blogs and white papers about what's coming.

In contrast to SAP Hana's all-in-memory approach, Microsoft is not proposing moving everything into memory. That means it's not doing away with aggregations or conventional disks or proposing the "dramatic simplification" of infrastructure that SAP is touting. Nor do the promised performance gains sound quite as dramatic as SAP's claims.

On the other hand, SQL Server 2014 can run on existing servers -- assuming available DRAM is in line with the scale of the workloads customers want to run in memory. Most reassuring of all, Microsoft is promising that if an application runs on SQL Server 2012, it will run on SQL Server 2014.

Of course, we'll have to wait for general release and for third-party software providers to certify on SQL Server 2014. But given that this is the most deployed database in the world (Oracle is the market leader in terms of license revenue), the promise of performance improvement without disruption is palpable.

Comments
D. Henschen, User Rank: Author
10/16/2013 | 7:03:47 PM
re: Microsoft SQL Server 2014: Final Countdown
I interviewed Quentin Clark, Corporate VP of the Data Platform Group at Microsoft, as part of my research for this article. It was Clark who announced CTP 2 on Wednesday, and here's what he told InformationWeek about Microsoft's two key in-memory competitors:

"There are in-memory solutions that exist that are new products [Author's note: He's clearly talking about SAP Hana here]. What they tell customers is, 'if you have an existing application or a packaged app, it will have to be rewritten.'

We don't subscribe to that view. We are building in-memory capabilities into Microsoft SQL Server in a way that's compatible so that customers will not be faced with rewriting their applications. That, alone, weeds out nearly all in-memory databases that are commercially available.

The other thing we can look at is where we are in development. Between our in-memory columnar work [for analytics], which was introduced in SQL Server 2012, and our in-memory transaction processing work, which will release as part of SQL Server 2014, we're light years ahead of where our competitors are [Author's note: Here he's clearly talking about the coming Oracle in-memory option for 12c]. They may have announced something, but there's no product there and there's not even a description of when that product may come."
rradina, User Rank: Ninja
10/16/2013 | 10:28:37 PM
re: Microsoft SQL Server 2014: Final Countdown
How granular is the in-memory option? When they say table, does a table include all of its indexes, or can you pick and choose any object you want to put in RAM? Of course it doesn't make a lot of sense to put the table in RAM and leave indexes on disk, but sometimes a heavily modified table has a lot of referential-integrity lookups. Sometimes those lookup tables aren't frequently updated, and to satisfy the RI check it might require less RAM to just put the checked index in memory instead of the table and all of its indexes.
D. Henschen, User Rank: Author
10/17/2013 | 12:19:48 AM
re: Microsoft SQL Server 2014: Final Countdown
Follow the links in the article above to the many blogs and white papers Microsoft has made available on the technical details of the implementation. CTP 2 brought new capabilities.
rradina, User Rank: Ninja
10/17/2013 | 8:25:55 PM
re: Microsoft SQL Server 2014: Final Countdown
OK -- it looks like the whole table needs to be in memory and there are some interesting restrictions: a maximum of eight indexes, there must be a primary key, and it cannot be an identity column. That last one caught me by surprise because I was thinking I could upgrade a DB to 2014, move a few tables into memory and experience a nice performance improvement. However, the restrictions mean it's far more deliberate than I was initially thinking. (I.e., max out the server's RAM, upgrade to 2014, move the hot tables into memory and, after lunch, play a round of golf.)
kfarlee, User Rank: Apprentice
10/18/2013 | 5:39:38 PM
re: Microsoft SQL Server 2014: Final Countdown
The granularity of objects that can be placed in memory is indeed the table and its associated indexes. The indexes themselves are significantly different, to take advantage of the fact that all rows in the table can be reached by a pointer reference. This contrasts with most or all other in-memory offerings, where the granularity is the database.

This means that you can approach adoption incrementally, moving only the data that is a performance issue, and not investing in tables which are not a problem.

Primary keys are indeed required for durable tables; however, tables that are declared as non-durable do not require them.

Identity is currently a limitation in the product which we are actively working to eliminate.

You are correct that this technology isn't a substitute for understanding where your bottlenecks and performance issues are, and being targeted in your enhancements.
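To illustrate the durable versus non-durable distinction described above, here is a hedged sketch of a non-durable declaration based on the preview documentation; the table, columns and bucket count are hypothetical. A SCHEMA_ONLY table loses its contents on restart, which suits transient data such as session state or staging rows:

    -- Non-durable: the schema survives a restart but the data does not,
    -- and no primary key is required.
    CREATE TABLE dbo.SessionState
    (
        SessionId  UNIQUEIDENTIFIER NOT NULL
            INDEX ix_SessionId NONCLUSTERED HASH WITH (BUCKET_COUNT = 100000),
        Payload    VARBINARY(8000)  NULL,
        TouchedUtc DATETIME2        NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);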
rradina, User Rank: Ninja
10/18/2013 | 7:39:22 PM
re: Microsoft SQL Server 2014: Final Countdown
Here's to the hope that the identity issue can be resolved. Then it might be possible to take an existing application, barring some other, more minor index restrictions, and move heavily updated tables into RAM with few or zero schema changes.

IMO it's absolutely impressive that some objects in an existing DB could be moved to RAM, thereby eliminating a lot of lock overhead. If the identity issue can be solved, I think that will be a huge upgrade draw for those looking to offer improved performance without radically redesigning the application or entertaining a significant infrastructure upgrade (other than adding lots of RAM).
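One workaround floated at the time was to generate keys from a SEQUENCE object (available since SQL Server 2012) rather than an IDENTITY column. The sketch below assumes a memory-optimized table already exists; the table and sequence names are hypothetical, and this is not an official Microsoft recommendation:

    -- Hypothetical example: draw key values from a sequence in the INSERT
    -- instead of relying on an IDENTITY property, which memory-optimized
    -- tables in the 2014 previews do not support.
    CREATE SEQUENCE dbo.OrderIdSeq AS BIGINT START WITH 1 INCREMENT BY 1 CACHE 1000;
    GO

    INSERT INTO dbo.OrdersInMemory (OrderId, CustomerId, PlacedUtc)
    VALUES (NEXT VALUE FOR dbo.OrderIdSeq, 42, SYSUTCDATETIME());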
shonthom, User Rank: Apprentice
10/30/2013 | 3:38:57 PM
re: Microsoft SQL Server 2014: Final Countdown
Oracle TimesTen is an in-memory database, and it has already been released. This article mentions the in-memory option in 12c, so what's the difference between the 12c option and the current TimesTen features?
sgittleson, User Rank: Apprentice
12/30/2013 | 5:45:12 AM
Confused
I thought 2012 puts SQL in RAM already, if your DB is small enough?

So for smaller DBs, say 150GB in size: if I have 256GB of RAM on the 2008 R2 Enterprise server, 2012 is already putting the DB in memory if my DB is 150GB?

So are you saying this feature is new in the sense that for larger DBs, 200GB+, you can now choose which tables and indexes you want to add to RAM?

And for smaller DBs, sub-200GB, where you have enough RAM for the entire DB, SQL 2014 will not offer any new performance benefits?

Thanks,

-Steven
D. Henschen, User Rank: Author
12/30/2013 | 1:43:36 PM
re: Microsoft SQL Server 2014: Final Countdown
TimesTen is an all-in-memory database that has been around for quite some time, but it has always served in a niche-applications role, in industries such as telecommunications. It is not a general-purpose database like Oracle Database or Microsoft SQL Server. The reason Microsoft and Oracle are introducing in-memory options for their flagship databases is to give customers the option of accelerating any application or certain aspects of applications that demand low latency.
anon5809147095, User Rank: Apprentice
3/14/2014 | 3:46:52 PM
Re: Confused
Good luck; you'll have a six-hour wait for the DB to mount with 250GB in memory if you restart the SQL instance.