Advanced analytics is all about statistical analysis and predictive modeling: being able to see what's coming and take action before it's too late, rather than just reacting to what has already happened. That latter practice, derisively known as "rearview-mirror reporting," is associated with conventional BI.
The more data companies use, the more accurate their predictions become. But the big data movement isn't just about using more data. It's also about taking advantage of new data types, such as social media conversations, clickstreams and log files, sensor information and other real-time feeds. Experienced practitioners are taking cutting-edge approaches, including in-database analytics, text mining and sentiment analysis.
In each of the past six years, respondents to our analytics and BI survey have rated their interest in 10 leading-edge technologies, and advanced analytics has always been the No. 1 choice. Advanced data visualization is No. 2 this year, up from being ranked third in 2009 (see chart at right). Last year we added "big data analysis" to the list of cutting-edge pursuits, and this year it ranked No. 4 along with collaborative BI.
We also see clear evidence that companies are investing in software, people and advanced techniques. For starters, this year we added "in-database analysis for predictive or statistical modeling" to our list of leading-edge technologies, and respondents rated their interest higher than for more-established categories such as mobile BI and cloud-based BI.
With in-database analysis, statistical and predictive algorithms are rewritten to operate inside databases that run on massively parallel processing (MPP) platforms. In-database analysis is faster than the old approach to data mining, in which analysts moved data sets from data warehouses into specialized analytic servers to create and test predictive models. Data movement delays plagued that approach, and the analytic servers were underpowered. As data sets grew, time and power constraints forced analysts to work with small data samples rather than all available information, limiting the accuracy of the resulting models.
Businesses that have embraced in-database approaches say they can develop models in less time for more precisely targeted segments, whether they're trying to predict customer behavior, product performance, business risks or other variables. What's more, MPP power lets them crunch through massive data sets, so they can use all available data and deliver far more accurate models.
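The core idea behind in-database analysis can be sketched in miniature. Instead of extracting every raw row to an external analytic server, the model-fitting arithmetic is pushed into the database, which returns only a handful of aggregate values. The sketch below, which uses SQLite purely as a stand-in for a warehouse (the table and column names `sales`, `spend` and `revenue` are illustrative assumptions, not from the article), fits a simple least-squares trend line from one SQL aggregate query:

```python
import sqlite3

# Stand-in "warehouse" with toy data: revenue = 2 * spend + 1.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (spend REAL, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [(x, 2.0 * x + 1.0) for x in range(1, 101)],
)

# In-database step: one pass computes the sufficient statistics.
# Only five numbers cross the wire, no matter how many rows exist.
n, sx, sy, sxx, sxy = conn.execute(
    "SELECT COUNT(*), SUM(spend), SUM(revenue), "
    "SUM(spend * spend), SUM(spend * revenue) FROM sales"
).fetchone()

# Closed-form least-squares slope and intercept from those aggregates.
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n
print(slope, intercept)  # -> 2.0 1.0 on this toy data
```

Real MPP platforms apply the same principle to far more elaborate models, but the payoff is the same: the heavy row-level scanning happens where the data lives, so the full data set, not a sample, feeds the model.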