Siri, Cortana Are Listening: How 5 Digital Assistants Use Your Data
Learn more about how digital assistants including Amazon Alexa, Apple's Siri, Facebook M, Google Now, and Microsoft Cortana are rewriting the rules around data privacy and sharing.
In August, Facebook introduced its new digital assistant, M, to a limited number of users, whose response was unlike anything Siri or Google Now had faced. People were flummoxed over whether M was actually an artificial intelligence (AI) win for Facebook or if there was a human on the other side of the screen.
The answer: Both.
A tremendous amount of learning is involved in AI, and Facebook's new baby was taking its first wobbly steps, with adult supervision. A Wired profile explained that M used AI to field the initial response to every question, but that a person approved or adjusted every answer before it went out. With every adjustment or implied thumbs up, M learned and got a little better at answering on its own.
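The loop the Wired profile describes -- AI drafts an answer, a human approves or corrects it, and the correction feeds back into the system -- can be sketched in a few lines. This is an illustrative toy, not Facebook's actual architecture; the class and its canned answers are invented for the example.

```python
class SupervisedAssistant:
    """Toy human-in-the-loop assistant: AI drafts, a person reviews."""

    def __init__(self):
        self.drafts = {}          # canned answers stand in for a real model
        self.training_data = []   # (question, approved answer) pairs

    def draft_reply(self, question):
        # the "AI" proposes an answer (or admits it doesn't know yet)
        return self.drafts.get(question, "I'm not sure yet.")

    def human_review(self, question, draft, correction=None):
        # a human approves the draft as-is or supplies a corrected answer
        final = correction if correction is not None else draft
        self.training_data.append((question, final))  # becomes training data
        self.drafts[question] = final                 # "learn" the reviewed answer
        return final


assistant = SupervisedAssistant()
draft = assistant.draft_reply("send mom flowers")
reply = assistant.human_review("send mom flowers", draft,
                               correction="Roses ordered for Mother's Day!")
print(reply)  # the human-corrected answer is what goes out to the user
# next time, the assistant can answer the same question on its own
print(assistant.draft_reply("send mom flowers"))
```

With every review cycle, the supervised answers accumulate as training data, which is how the system gradually gets "a little better at answering on its own."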
There are a few key things happening that speak to the bigger picture of big data and analytics today, when a technology like M -- or Cortana or Alexa -- can do something like send your mom a bouquet on Mother's Day.
One is that the software is learning processes and figuring out how to make connections. Because thousands of other people have made the same requests, it has already corrected mistakes and found efficiencies that have nothing to do with any individual user. The learning comes from the enormous datasets created when thousands or millions of people contribute data points.
Another key thing that these assistants are learning about each of us is how to better sell to us.
Still another is that we consumers are getting increasingly used to the idea of sharing data, like our credit card number, our mom's name and address, and many other data points that we may or may not realize can be of value to an algorithm and a company like Facebook or Apple.
That willingness, along with our understanding of the larger value we receive in return, will drive a similar and inevitable shift within enterprises. In our professional roles, each of us needs difficult questions answered (will we meet the sales forecast?) and difficult tasks accomplished (such as thwarting hackers).
Enterprises, particularly in regulated sectors like government and healthcare, increasingly understand and embrace the benefits of contributing data -- in secure ways -- to larger datasets that can reveal critical and otherwise unavailable insights.
This first wave of digital assistants may be all it takes to warm consumers to the concept and push more enterprises beyond legacy, pre-Internet thinking about data, how it should be treated, and what it can make possible.
In the following pages we take a look at digital assistants from five of the largest tech players and how they're using your data to perfect their AI. Take a look and let us know your thoughts in the comments section below.
In the right mood, Siri will respond to questions about what she does with your data with: "I'm sorry, but I can't answer that."
A look at Apple's Terms & Conditions, however, clarifies that everything you say to Siri is transcribed into text in order to process your requests. Apple also collects information about names and nicknames in your address book, songs and names in your music collection, and other data -- collectively known as your user data -- in order "to help Siri and dictation understand you and better recognize what you say."
Yet, this data "is not linked to other data that Apple may have from your use of other Apple services," Apple clarifies.
By using Siri, Apple adds, you agree to allow Apple and its subsidiaries and agents to transmit, collect, maintain, process, and use your voice input and user data.
Oddly, the Apple agreement for iOS 9 doesn't mention Siri or dictation the way the agreements for earlier versions of iOS do.
Amazon perfected its voice capabilities with the Amazon Dash and Fire TV. Then it introduced Echo, a minimalist, cylindrical speaker tied to a cloud-based service that goes by Alexa. Say, "Hey, Alexa …" to get her attention. She can add items to an Amazon shopping cart, answer any question that the Internet can, and play music from the account you're signed in with (otherwise, she'll just play you a free snippet).
The Alexa Terms of Use state that Amazon saves the recordings of your voice that are sent to the cloud for processing by Alexa. However, you can delete those recordings by going into Settings and then History in the Alexa app.
In the Echo FAQs, Amazon explains that "when you use the wake word, the audio stream includes a fraction of a second of audio before the wake word." This suggests that Alexa is always listening, unless the Echo's microphone is manually turned off.
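Capturing audio from *before* the wake word is a standard trick: the device keeps a small rolling buffer of recent audio at all times, so when the wake word fires, the preceding fraction of a second is already in memory. The sketch below is illustrative (frame sizes and durations are made up, not Amazon's), using a fixed-length deque as the ring buffer.

```python
from collections import deque

FRAME_MS = 20              # one audio frame = 20 ms (illustrative)
PREROLL_FRAMES = 25        # keep ~500 ms of audio at all times

preroll = deque(maxlen=PREROLL_FRAMES)   # oldest frames fall off automatically


def on_audio_frame(frame, wake_word_detected):
    """Buffer every frame; on wake word, return the buffered pre-roll."""
    preroll.append(frame)
    if wake_word_detected:
        # the stream sent to the cloud starts with audio captured
        # *before* the wake word, because it was already buffered
        return list(preroll)
    return None


# simulate 100 frames of silence, then a frame containing the wake word
for _ in range(100):
    on_audio_frame(b"\x00" * 320, wake_word_detected=False)
captured = on_audio_frame(b"WAKE", wake_word_detected=True)
print(len(captured))   # 25 frames: the wake word plus ~480 ms before it
```

This is also why "always listening" and "always recording to the cloud" are different claims: the ring buffer continuously overwrites itself on-device, and only the short window around a detected wake word is sent anywhere.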
Apple's Siri also used to work like this, but with iOS 9 Apple made "off" the default for Siri. The always-listening capability can be re-enabled in the iPhone's Settings.
Google Now has arguably the least fun name of the five major assistants and undoubtedly the driest personality. While Siri's and Cortana's programmers anticipated and prepared for likely questions such as how much wood a woodchuck might chuck, Google Now plays it straight by offering up Internet pages.
Google Now is also always listening unless it's turned off. How to accomplish that depends on the device you're using.
While the other assistants wait to be engaged, Google Now proactively engages users, based on predefined preferences and its ability to keep learning. Knowing your route to and from work, for example, it may use its pop-up "Cards" to alert you to traffic or suggest an alternate route. Or, if you set a location-based alert, it can ping you when you're walking past the grocery store so you don't walk by again without remembering to buy milk.
Cortana is something of a mashup of Siri and Google Now. She has a female voice, like Siri, and can field a variety of questions, from scientific to cheeky. You can also tell her about your interests and needs, and set location-based alerts. She can, for example, tell you the score when your favorite team is playing or remind you to call into the conference call the moment you arrive at work.
When you use Cortana, Microsoft says it "collects and uses" information, including your location, location history, contacts, voice input, search history, calendar details, the content and communication histories from messages and apps, and "other information on your device."
Cortana can be turned off, and information saved about a user can be cleared.
On Feb. 18, Microsoft made updates to Windows 10 that include making it easier to use Cortana to search for music on a Windows Phone device. Now, users can just tap the little box in the top corner with the musical notes.
There are several excellent accounts of early users trying out M, but there's oddly little about M on the Facebook site, its blog, or news pages. Even the page introducing M to select users in August seems to have been pulled. But what we do know is that M is something of a hybrid -- AI with human supervision.
Early users, trying for proof of whether humans were involved in real time, asked M to perform all kinds of ridiculous things, which it did, because there are indeed humans involved. But the team behind M is ambitious and serious. In January 2015, Facebook acquired Wit.ai, an open source platform that takes spoken or written text and turns it into "actionable data" -- or a "Siri-like conversational interface," according to the LinkedIn profile of former Wit.ai CEO Alex Lebrun, who is now in charge of the AI team teaching M.
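"Turning text into actionable data" generally means mapping a free-form request to a structured intent with slots that downstream code can act on. Platforms like Wit.ai do this with trained models; the keyword-based sketch below is only a toy illustration of the input/output shape, and every intent name and rule in it is invented.

```python
import re


def parse_intent(text):
    """Toy intent parser: free-form text in, structured intent dict out."""
    t = text.lower()
    if "flower" in t or "bouquet" in t:
        # slot-fill the recipient if we recognize one
        recipient = "mom" if "mom" in t else "unknown"
        return {"intent": "send_flowers", "recipient": recipient}
    m = re.search(r"play (.+)", t)
    if m:
        return {"intent": "play_music", "query": m.group(1)}
    return {"intent": "unknown", "text": text}


print(parse_intent("Send my mom a bouquet for Mother's Day"))
# {'intent': 'send_flowers', 'recipient': 'mom'}
print(parse_intent("Play some jazz"))
# {'intent': 'play_music', 'query': 'some jazz'}
```

The structured dict is the "actionable data": once a request is reduced to an intent plus slots, ordering the flowers or starting the music is ordinary application code.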