Enterprise IT organizations still aren't ready to go all-in on open source, virtualized and cloud databases. But having those options on the table gives buyers some negotiating leverage with database vendors.
Among the 716 respondents to our InformationWeek 2013 State of Database Technology Survey, 91% of those interested in an integrated analytical platform are using, piloting or investigating Hadoop; 16% are making use of the cloud for their primary databases; and 43% are virtualizing their primary databases, up six points since 2012.
When it comes to pricing, most survey respondents say the ideal model for primary databases, data warehouses and data marts is an enterprise license, with no limit on usage. In our 2012 survey, just 28% of respondents said they had an enterprise license for their primary databases, 35% for their data warehouses and 29% for their data marts. This year, those percentages are up to 33%, 36% and 37%, respectively. Customers are making that progress in part because competing open source and cloud database vendors are offering innovative pricing models tied to customer business metrics. Possibly the best-known cloud options are Amazon Web Services' NoSQL SimpleDB and Amazon Relational Database Service, but Microsoft, Google, Heroku and others also offer database-as-a-service.
"We've reached an interesting time in the development of databases," says Joe Masters Emison, founder and CTO of BuildFax, which collects and organizes construction data on millions of properties. "Open source databases, which are free to use, are every bit as powerful as extremely expensive commercial options." Proof: MySQL powers all or part of nine of the 10 most-visited websites worldwide. Emison also cites the rise of specialized databases -- NoSQL, graph and in-memory databases -- to handle atypical use cases, such as Facebook's Graph Search or the real-time analysis Wall Street traders depend on. At one time, IT organizations would have had to spend heavily to scale their relational databases for such applications.
Then there's the maturation of object-relational mapping (ORM) -- effectively mapping software objects into relational databases in a standard fashion. "Most new development is done with ORM, which makes databases more interchangeable: You don't have to alter your code to change your database server," Emison says. "Add to this the rise of cloud computing, and organizations are no longer locking themselves into a particular database server for the life of an application."
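The interchangeability Emison describes can be sketched with a toy mapper in Python. This is a minimal illustration, not a production ORM: libraries such as SQLAlchemy or Hibernate generate the backend-specific SQL for you, and the `Property` class and SQLite backend here are illustrative assumptions, not anything from the survey.

```python
import sqlite3
from dataclasses import dataclass, fields

@dataclass
class Property:
    id: int
    address: str
    year_built: int

class SQLiteMapper:
    """Toy object-relational mapper: persists dataclass instances to a table.

    Real ORMs generate this kind of SQL per backend, which is why
    application code rarely has to change when the database server does.
    """
    def __init__(self, conn, cls):
        self.conn, self.cls = conn, cls
        cols = ", ".join(f.name for f in fields(cls))
        conn.execute(f"CREATE TABLE IF NOT EXISTS {cls.__name__} ({cols})")

    def save(self, obj):
        placeholders = ", ".join("?" for _ in fields(self.cls))
        values = tuple(getattr(obj, f.name) for f in fields(self.cls))
        self.conn.execute(
            f"INSERT INTO {self.cls.__name__} VALUES ({placeholders})", values
        )

    def get(self, obj_id):
        row = self.conn.execute(
            f"SELECT * FROM {self.cls.__name__} WHERE id = ?", (obj_id,)
        ).fetchone()
        return self.cls(*row) if row else None

# Application code talks to objects, not SQL dialects.
conn = sqlite3.connect(":memory:")
mapper = SQLiteMapper(conn, Property)
mapper.save(Property(1, "42 Main St", 1987))
print(mapper.get(1))
```

Because only the mapper knows the SQL, pointing the connection at a different server is, in principle, a configuration change rather than a rewrite -- which is the portability Emison credits with loosening database lock-in.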
Not all the chips are falling in IT's favor, however. For one thing, data management and analytics expertise is in short supply. At least 140,000 positions in those fields will remain unfilled by 2018, according to a recent InformationWeek report on the talent gap in big data analytics. Meanwhile, social media, mobile and myriad other smart applications -- think Internet-enabled security systems, electronic medical records and geographic systems -- are increasing both the size and complexity of data stores.
In addition, respondents aren't making full use of virtualization and the cloud. Yes, 43% use virtualized environments for their primary databases, but that means the majority are still holding back, with most citing performance worries. As our 2013 Virtualization Management Survey shows, application availability is a key focus of next-generation hypervisor releases, so we expect those worries to subside in short order.