4 Technologies That Are Reshaping Business Intelligence
Next-generation BI is being formed by predictive analytics, real-time monitoring, in-memory processing, and SaaS.
Past performance is no guarantee of future results. This investment-prospectus lingo has never been more apt for business in general than in this post-financial-meltdown, pre-recovery economy. Yet now more than ever, top executives, corporate directors, and financial markets want no surprises.
So it's pretty clear why business intelligence initiatives continue to top CIO priorities, as executives from the boardroom on down demand better visibility. The problem is that BI has often fallen short of the ideal, delivering insight into the past but not into up-to-the-moment performance or future prospects.
That's about to change. Next-generation BI has arrived, and three major factors are driving it: the spread of predictive analytics, more real-time performance monitoring, and much faster analysis, thanks to in-memory BI. A fourth factor, software as a service, promises to further alter the BI market by helping companies get these next-generation systems running more quickly.
Predictive analytics is a white-hot growth segment that got even hotter with IBM's $1.2 billion deal to buy SPSS, whose software uses algorithms and statistical models to spot trends, risks, and opportunities in ways not possible with historical reporting.
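At its simplest, predictive analytics means fitting a model to historical data and extrapolating forward. The sketch below fits a least-squares trend line to a few quarters of sales and projects the next quarter; the figures are invented for illustration, and real products such as SPSS use far richer statistical models than a straight line.

```python
# Minimal illustration of the predictive idea: fit a linear trend to
# historical quarterly figures and project one step ahead.
# The data is hypothetical.

def linear_forecast(history, steps_ahead=1):
    """Least-squares line through (index, value) points, extrapolated."""
    n = len(history)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(history) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return intercept + slope * (n - 1 + steps_ahead)

quarterly_sales = [100.0, 104.0, 110.0, 113.0]  # made-up history
print(round(linear_forecast(quarterly_sales), 1))  # projects 118.0
```

The point isn't the arithmetic, which a spreadsheet can do; it's that the same forward-looking question, asked across thousands of products or accounts with more sophisticated models, is what the predictive-analytics vendors sell.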
Between the extremes of rearview-mirror reporting and advanced predictive analytics lies real-time monitoring. Front-line managers and executives increasingly want to know what's happening right now--as in this second, not yesterday or even 10 minutes ago. This is where stream processing technologies are moving beyond niche industry uses. Real-time monitoring detects events or patterns of events as data streams through transactional systems, networks, or communications buses. Proven on Wall Street and in other data-soaked industries, stream processing technologies deliver subsecond insight that conventional BI can't touch.
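The core of stream processing is reacting to events as they arrive rather than querying them later. Here's a toy sketch in that spirit: it watches a live feed of transaction amounts and flags the moment a rolling average over the last few events crosses a threshold. The feed, window size, and threshold are all illustrative assumptions, not any vendor's API.

```python
from collections import deque

def detect_spikes(stream, window=3, threshold=500.0):
    """Flag positions where the rolling average of the last `window`
    events exceeds `threshold` -- react now, not in tomorrow's report."""
    recent = deque(maxlen=window)  # bounded buffer of the latest events
    alerts = []
    for i, amount in enumerate(stream):
        recent.append(amount)
        if len(recent) == window and sum(recent) / window > threshold:
            alerts.append(i)
    return alerts

feed = [120.0, 90.0, 300.0, 800.0, 900.0, 110.0]  # hypothetical feed
print(detect_spikes(feed))  # alerts at events 4 and 5
```

Commercial stream-processing engines apply the same pattern-over-a-window logic at millions of events per second, across many correlated streams at once.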
Forward-looking and real-time analysis aren't brand-new BI concepts, but in-memory processing is making them more practical. Until next-generation in-memory products emerged, you usually needed pre-built cubes, pre-defined queries, and summarized data, and "what if" exploration meant long-running queries. All those requirements killed spontaneous exploration. In-memory products, unlike tools that explore historical data on disk, load vast data sets into RAM so people can perform queries in seconds that would take minutes or even hours with conventional tools.
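The shift is easiest to see in miniature: load the raw rows into RAM once, then answer ad hoc questions with plain scans instead of pre-built cubes. The column names and figures below are invented for illustration; real in-memory engines add columnar compression and parallel scans to make this work at warehouse scale.

```python
from collections import defaultdict

rows = [  # imagine millions of these, loaded into RAM once
    {"region": "East", "product": "A", "revenue": 1200},
    {"region": "East", "product": "B", "revenue": 700},
    {"region": "West", "product": "A", "revenue": 950},
    {"region": "West", "product": "B", "revenue": 400},
]

def group_sum(rows, key, measure):
    """Ad hoc aggregation over in-memory rows, no predefined cube needed."""
    totals = defaultdict(float)
    for r in rows:
        totals[r[key]] += r[measure]
    return dict(totals)

# Any grouping the analyst dreams up, answered on the spot:
print(group_sum(rows, "region", "revenue"))
print(group_sum(rows, "product", "revenue"))
```

Because nothing is pre-aggregated, the next question ("now by product," "now by salesperson") costs another in-memory scan rather than another overnight cube build.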
Scouring demand data can flag trends and problems sooner.
The fourth factor in the next generation of BI addresses another place where speed is needed: the deployment phase. With software-as-a-service options, BI doesn't always require the months-long distraction of building a data warehouse or a new data mart application, something particularly attractive for small IT shops (see story, "SaaS Makes Its Mark In Business Intelligence").
This next generation of BI technology is still evolving and comes with plenty of risk. Prediction typically requires statistical expertise that's scarce and pricey. Real-time monitoring with stream processing technology can be a lifesaver, but only if you can respond as quickly as you detect opportunity or risk. Fast in-memory analysis tools are selling briskly, but they may require companies to pony up for higher-performance 64-bit hardware. And if you're going to expose these powerful BI tools to new users, be mindful of misinterpretation.
Avoid these pitfalls, however, and there's no turning back to guesswork forecasting, weeks-old reports, and glacial querying.