For the past couple of weeks I've been searching in vain for evidence of the analytics continuum that was a staple of data-centric conferences and research papers for the past five years. Has it gone the way of the PDA (that's personal digital assistant; the other PDA is still popular), booted out of our lexicon by advancing technology?
You remember the continuum: it usually featured three or four subheads, with a paragraph or ten explaining each stage in PowerPoint decks and research reports. First came "descriptive analytics." You didn't want to be stuck there, simply analyzing what happened last year.
The next stage was "predictive analytics." Yay, we can see where we are going next year.
Then we always saw "prescriptive analytics." Almost Nirvana: this was where our computers would recommend possible solutions to our challenges and weight them by likelihood of success. Only a fraction of companies had prescriptive analytics in production at the start of 2017.
The courageous conference speaker often would cap the continuum presentation by adding a Stage Four. Depending on the presenter, that could be called "cognitive computing" or "AI/machine learning." The machine would fix our problems for us. Can you say "early retirement"?
Now, I understand that analytics professionals hate the term "anecdotal evidence" because it isn't really data. However, it's all I have to work with so far. Simply put, I can't recall the last time -- maybe it was May at Interop -- that I saw a fresh reference to either "predictive analytics" or "prescriptive analytics." I'll note that I see a dozen or more news pitches, tech papers, and feature articles most days, so there are plenty of opportunities for those terms to appear.
What happened? Consultants, analysts, software vendors, cloud providers, even the CIOs, CAOs, and data scientists who head up data analytics projects have skipped right to Stage Four, at least in how they talk, if not in their actions. They are AI-driven and machine learning focused.
If what folks around the industry say is true, AI is easy. Maybe you can pick up some AI on the way back from the grocery store. We're hearing that AI can do everything.
It can even eliminate that trip to the supermarket by knowing what you want for dinner and getting the ingredients delivered by drone. If you insist on going to the market, don't worry, AI will drive your car. AI is ready even to serve you in the bedroom, with an AI-based pillow that adjusts to your sleep habits and an AI-based sex toy. (No need to get into details on the latter.)
We're just missing a few key steps here. The most obvious is that much of the AI miracle still resides in development and test labs. Even more significant is the part the AI cheerleaders overlook: the hard work that has to go into so many of these projects.
That hard work involves tasks such as finding and vetting data sources, cleaning and managing data, securing systems, protecting data privacy, scaling from pilot projects, and building out models and algorithms. Gee, these are all the things that those of you in the All Analytics community do for a living.
The nice thing about the lost continuum is that it walked us through a process, and that process was intended to help us get things right. Let's hope someone puts it back in business before we get smothered by our smart pillows.