We have entered the immersive age, where machines are controlled with human gestures and voice, and the concept presents an exciting future for analytics.

Jen Underwood, Impact Analytix

June 2, 2017

3 Min Read
Image: Narrative Science and Sisense

We are witnessing a fascinating change in analytics and across all industries. Novel immersive user experiences that leverage artificial intelligence and human language are just beginning to emerge. Natural Language Generation (NLG) is powering conversations between humans and machines, and in doing so it is empowering the masses.

In the industrial age, humans operated pedals, gearshifts, wheels, and switches. Machines would provide feedback via gauges or signals. In the information age, humans have interacted with machines by typing, pointing, clicking, or touching a screen. In the immersive age, machines will be controlled with human gestures and voice. These changes in human-computer experience will influence exciting future analytics product designs. In fact, it is already happening.

NLG technology is inspiring an entirely new generation of analytics solutions and supplementing traditional ones. With natural language search user interfaces, users can interact with data much as they would with a simple Google search. This popular capability, found in ThoughtSpot and other offerings, streamlines basic data analysis by allowing users to ask questions of their data.
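
To make that concrete, here is a toy Python sketch of the question-and-answer flow over tabular data. The keyword matching, column names, and sample numbers are all assumptions for illustration; this is not how ThoughtSpot or any commercial search engine actually parses questions.

```python
# Toy sketch: answer a natural-language question against a small table.
# The "parser" is a simple keyword match, purely for illustration.
import pandas as pd

sales = pd.DataFrame({
    "region": ["East", "West", "East", "West"],
    "revenue": [120, 95, 140, 110],
})

def answer(question: str, df: pd.DataFrame) -> str:
    q = question.lower()
    if "total revenue" in q and "by region" in q:
        totals = df.groupby("region")["revenue"].sum()
        return ", ".join(f"{region}: {value}" for region, value in totals.items())
    return "Sorry, I can't answer that yet."

print(answer("What is total revenue by region?", sales))
# East: 260, West: 205
```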

Taking it one step further, NLG user interfaces with added text-to-speech (TTS) voice output are popping up in analytics “bots”. These solutions are enabled by technologies such as Alexa, the Amazon service that allows humans to interact with devices using voice.
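
To show the voice side, here is a minimal, hypothetical sketch of an Alexa-style skill response that speaks a single metric aloud. The intent name and revenue figure are invented, and a real skill would also need to be registered and hosted through the Alexa Skills Kit.

```python
# Hypothetical analytics "bot" handler: return an Alexa-style response payload
# whose outputSpeech is read aloud to the user by the device.
import json

def handle_intent(intent_name: str) -> dict:
    if intent_name == "GetMonthlyRevenueIntent":  # invented intent name
        speech = "Revenue for May was 1.2 million dollars, up 8 percent from April."
    else:
        speech = "Sorry, I don't know that metric yet."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }

print(json.dumps(handle_intent("GetMonthlyRevenueIntent"), indent=2))
```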

Analytics-savvy NLG APIs such as Narrative Science's Quill or Automated Insights' Wordsmith are also game changing. These solutions can automatically identify insights in data and explain the findings. They are being used right now in healthcare, insurance, news, sports, financial services, business intelligence and CRM applications. Narrative Science demonstrated the combination of NLG with TTS for analytics at the 2017 Gartner Data & Analytics Conference.
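
For a flavor of what these engines automate, here is a deliberately simple, template-based sketch that turns a few computed facts into a readable sentence. Commercial products such as Quill and Wordsmith are far more sophisticated; the data and wording below are assumptions made only for the example.

```python
# Simplistic template-based narrative generation from a few computed facts.
import pandas as pd

monthly = pd.DataFrame({
    "month": ["Mar", "Apr", "May"],
    "sales": [180_000, 195_000, 234_000],
})

latest, prior = monthly["sales"].iloc[-1], monthly["sales"].iloc[-2]
change = (latest - prior) / prior
direction = "rose" if change > 0 else "fell"

narrative = (
    f"Sales {direction} {abs(change):.0%} in {monthly['month'].iloc[-1]}, "
    f"reaching ${latest:,.0f}, the highest month in the period."
)
print(narrative)
# Sales rose 20% in May, reaching $234,000, the highest month in the period.
```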

We first got a glimpse of what NLG could do for analytics when the innovative BeyondCore entered the market. BeyondCore, acquired by Salesforce in September 2016 and incorporated into Salesforce Analytics Cloud Einstein, automated diagnostic, predictive and prescriptive insights.

The hidden benefits

The most compelling aspect of that application was not analytics automation. It was the user-friendly explanations of advanced analytics, complete with recommendations. Previously, only data scientists might have been able to interpret the output. By making complex analytics concepts easy for anyone to understand, knowledge that was once limited to a select few could finally be extended to everyone.

Automatically discovering and explaining insights with NLG is valuable. It clearly saves time when compared to manual approaches. Less obviously, NLG also reduces potential reporting bias. Humans may not recognize the many hidden biases in how data is collected and how analysis is interpreted. From selective perception to outcome bias, machines should minimize analytics bias since they are not motivated to manipulate the data. Unlike humans, machines cannot tell a lie.

Another amazing aspect of applying NLG to analytics is the ability to provide human context. Anyone who practices analytics knows context is key. When combined with artificial intelligence, historical usage data, prior questions, prior decisions, location and other important semantics, NLG-enabled analytics can personalize what the numbers actually mean. Unbiased explanations of insights, communicated with a rich, nuanced understanding of your organization and people, are invaluable.

NLG and text-to-speech (TTS) are both growth areas with exciting potential for significant disruptive innovation to humanize numerous applications. Research firm MarketsandMarkets has valued the global TTS market at $1.3 billion and predicts that it will reach $3.03 billion by 2022, growing at a compound annual rate of 15.2%. TTS is already widely used in smartphones for navigation and personal assistant apps, and in laptops and televisions to help blind people access programs and information. Digital transformation and additional machine learning applications will further contribute to the advancement of humanizing analytics.
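
As a quick sanity check on those figures, assuming the $1.3 billion valuation is the 2016 baseline, the cited growth rate does compound to roughly the projected 2022 number:

```python
# Compound the cited base at 15.2% per year from an assumed 2016 baseline to 2022.
base_usd_billions = 1.3
cagr = 0.152
years = 2022 - 2016
projected = base_usd_billions * (1 + cagr) ** years
print(round(projected, 2))  # ~3.04, in line with the cited $3.03 billion forecast
```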

Imagine the many powerful analytics applications that sit as shelfware today because they are too difficult or overwhelming for most people to use. Adding NLG to those apps may be the secret to building a true data-driven culture. Natural user interfaces represent the next major disruption in computing.

About the Author(s)

Jen Underwood

Impact Analytix

Jen Underwood, founder of Impact Analytix, LLC, is a recognized analytics industry expert. She has a unique blend of product management, design and over 20 years of "hands-on" development of data warehouses, reporting, visualization and advanced analytics solutions. In addition to keeping a constant pulse on industry trends, she enjoys digging into oceans of data. Jen is honored to be an IBM Analytics Insider, SAS contributor, former Tableau Zen Master, and active analytics community member.

In the past, Jen has held worldwide product management roles at Microsoft and served as a technical lead for system implementation firms. She has launched new analytics products and turned around failed projects. Today she provides industry thought leadership, advisory, strategy, and market research.

Jen has a Bachelor of Business Administration - Marketing, Cum Laude from the University of Wisconsin, Milwaukee and a post-graduate certificate in Computer Science - Data Mining from the University of California, San Diego.
