New studies from Constellation Research and IDC reveal that the business intelligence market is in transition, with analytics, visualization, and big data driving the fastest growth.
"BI is dead! Long live BI!"
This is the provocative title of a new report that concludes that business intelligence as we know it is in transition, becoming just one element of "a continuum of decision-management capabilities."
That continuum, according to the report's author, Constellation Research analyst Neil Raden, will include everything from predictive modeling, machine learning, and natural language processing, to business rules, data visualization, and what the report describes as "traditional" BI.
Rest assured: there's still a large and growing BI market, as revealed by the latest installment of IDC's annual BI and analytics market share report, "Worldwide Business Analytics Software 2012-2016 Forecast and 2011 Vendor Shares," which was released last week. But just the fact that the name of the IDC report has been changed from "Business Intelligence" to "Business Analytics" speaks volumes about which way the market is headed.
We all saw the hand-wringing in recent years over BI not living up to its promise, with adoption rates below 20% or even 10% of potential users at many enterprises. But that's "probably the right level" given the limitations of legacy BI tools, says Raden. I couldn't agree more, and I've previously called for better ease of use, ease of deployment, affordability, and ease of administration.
What's largely missing from the BI landscape, says Raden, is the ability for business users to create their own data models. Modeling is a common practice, used for what-if simulation and scenario planning. Pricing models, for instance, are used to predict sales and profits if a low-margin product, X, is eliminated in the hope of retaining its customers with products A, B, and C.
Insurance companies use models to map out their policies by region and predict claims in the event of a Category 5 hurricane or at various flood stages. Yield management models are used by hotels and airlines to fill rooms and seats. Risk and contingency models are used by financial services firms to forecast loan failure rates and plan reserves against losses.
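The what-if pricing scenario described above can be sketched in a few lines of code. This is a hypothetical illustration, not anything from Raden's report: the product mix, margins, and migration rate are invented numbers, and the model simply asks how total profit changes if a low-margin product is dropped and some fraction of its unit sales migrates to the remaining products.

```python
# Hypothetical what-if model: what happens to profit if we drop
# a low-margin product, assuming a fraction of its customers
# shift their purchases to the remaining products?

def scenario_profit(products, dropped=None, migration_rate=0.0):
    """Total profit for a product mix; optionally drop one product
    and migrate a fraction of its unit sales evenly across the rest."""
    products = dict(products)  # don't mutate the caller's mix
    migrated_units = 0.0
    if dropped is not None:
        migrated_units = products.pop(dropped)["units"] * migration_rate
    per_product = migrated_units / len(products) if products else 0.0
    return sum((p["units"] + per_product) * p["margin"]
               for p in products.values())

# Illustrative numbers only -- not from the article or the report.
mix = {
    "A": {"units": 1000, "margin": 12.0},
    "B": {"units": 800,  "margin": 9.0},
    "C": {"units": 1200, "margin": 7.5},
    "X": {"units": 2000, "margin": 1.5},  # low-margin candidate
}

baseline = scenario_profit(mix)
what_if = scenario_profit(mix, dropped="X", migration_rate=0.4)
print(baseline, what_if)
```

Trivial as it is, this is exactly the kind of model business users today build in Excel, with all the version-control and consistency problems that entails.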
The models described above are the province of sophisticated analytics teams, but Raden says business users need flexible and easy-to-use tools for modeling. Microsoft Excel and budgeting and planning applications come closest, he says, but we need tools that are less prone to creating data inconsistencies and version-control problems than Excel, on the one hand, and that are more accessible to business users than budgeting and planning apps, on the other.
Where corporate data is concerned, BI is too often locked into read-only reporting against fixed-schema data warehouses. Want to add a new data attribute? Well, that will require an IT project and a few days or weeks of work to change the schema. Such rigidity is just not in keeping with an era that's supposed to be about embracing new data sources and supporting decisions with deep insight.
Raden lays out a dozen data-modeling best practices that he says will lead the way to better BI. He wants visual modeling tools that hide the complexity of selecting data sources and improve understanding. He calls for zero coding, so the act of exploring new data sources doesn't demand a degree in programming. And he asks for robust collaboration and workflow capabilities, so insights can be shared and connected with business processes.
It's no coincidence that IDC's latest BI and analytics market share stats show that the three fastest-growing vendors in the industry are Tableau Software, QlikTech, and Tibco Spotfire, with reported growth rates of 94%, 43%, and 23% in 2011, respectively. All three blend data visualization, analytics, and high-scale in-memory analysis capabilities. In my view they're moving toward the kind of flexible and accessible analysis environments that Raden calls for. Their interfaces and approaches are being imitated by larger vendors, though it's too soon to say whether those efforts will transform the way people interact with BI.
IDC forecasts that advanced analytics (the uber category for predictive modeling and machine learning) will grow 10.1% per year through 2016 and content analytics (the parent of natural language processing) will grow 14.5% per year through 2016. Traditional BI query, reporting, and analysis tools, meanwhile, will see still-impressive 9.5% annual growth, according to IDC.
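To put those annual rates in perspective, they compound. Assuming IDC's figures apply over a five-year window on a 2011 base (the window is my assumption; the rates are from the article), a quick calculation shows how much each category would grow in total by 2016:

```python
# Cumulative effect of IDC's forecast annual growth rates,
# assuming five years of compounding on a 2011 base.

def cumulative_growth(cagr, years=5):
    """Multiplier on the base-year market size after `years`
    of compound annual growth at rate `cagr`."""
    return (1 + cagr) ** years

for name, rate in [("advanced analytics", 0.101),
                   ("content analytics", 0.145),
                   ("traditional BI", 0.095)]:
    print(f"{name}: {cumulative_growth(rate):.2f}x")
```

Under that assumption, even "traditional" BI's 9.5% rate compounds to a market more than half again as large by 2016, while content analytics nearly doubles.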
The data warehousing platform category is expected to grow 11.2% per year through 2016. Open source, nonrelational big data platforms such as Hadoop and NoSQL databases will mostly run alongside existing business analytics systems, IDC predicts, but in a few cases will cut into conventional data warehousing sales.
Most of the cost of big data platforms is in hardware and services, not software, says Raden. Nonetheless, the demands of big data analysis make flexible data modeling all the more important.
Business people want software that lets them "address only the meaning of data, not its structure, location, or format," Raden explains.
That was true back when BI tools analyzed relatively small quantities of run-of-the-mill transactional data. It's an even bigger imperative now that data's volume, variety, velocity, and complexity are making it harder to manage.