Impact of Wearables and IoT on Privacy
Apple CEO Tim Cook's recent defense of privacy and encryption highlights an issue of paramount importance to today's IT leaders: data privacy. IT leaders may debate the scope of their involvement with security and privacy issues, as they did at the recent MIT Sloan CIO Symposium. But data privacy, a perennial problem, has become critical for every organization as we fit every one of our things for network connectivity.
Though wearables and the Internet of Things (IoT) share privacy concerns similar to those of mobile devices, the continuous, on-the-body, or ubiquitous nature of their sensing amplifies the privacy issues. Ambient computing, an ecosystem of wearables and IoT devices on or near a user, can acquire intelligence about a user and his or her surroundings by sensing, processing, and communicating data to infer the user's context, external stimuli, behavior, and intent.
This article highlights the specific privacy concerns related to wearables and IoT, and gives you best practices for consumer privacy protection.
According to Motti and Caine's study, "Users' Privacy Concerns About Wearables: Impact of Form Factor, Sensors, and Type of Data Collected," privacy concerns about wearables are similar to, but in some cases more specific than, privacy concerns about mobile devices. The study also shows that users are aware of the potential privacy implications, particularly during data collection and sharing. Users' privacy concerns center on a wearable device's ability to sense, collect, and store data that are often private, personal, or sensitive, and then share these data with unknown or unethical parties.
The study specifically highlights the following privacy concerns:
Social implications and the lack of awareness of the impact on the privacy of others: Devices may not only record a user's activity, but also record the activities of those around the user.
"Right to forget": Users fear that when certain data are combined, they could have serious personal implications; users therefore want the data collected -- with or without user consent or awareness -- to be deleted.
Implications of location disclosure: Users are concerned that their GPS location may be made available to malicious parties and criminals.
Discreet display of confidential information: Confidential information displayed on smart watches may be viewable by other parties nearby.
Lack of access control: Users fear that organizations and the government may use their personal data without their awareness or consent.
Surveillance and sousveillance: Users fear continuous surveillance and sousveillance, not only as a matter of personal privacy, but also in light of the potential for criminal abuse.
Privacy concerns for head-mounted devices: Users are concerned that head-mounted display (HMD) computers with cameras and microphones may impact their privacy and the privacy of others.
Speech disclosure: Users express concerns about their speech being overheard or recorded by others.
Surreptitious audio and video recording: Users are concerned that wearables with camera and audio input may record them discreetly without their knowledge.
Facial recognition: Users are concerned that systems may recognize and identify them individually.
Automatic synchronization with social media: Some users do not like the idea of their devices immediately synchronizing with social media applications and sharing their data without being able to control this sharing.
Visual occlusion: Head-mounted displays that cover the user's field of view disrupt the user's ability to interact privately because vision is blocked.
According to PwC's report "Consumer Intelligence Series: The Wearable Technology Future," 82% of respondents in the survey indicated that they are worried that wearable technology would invade their privacy. Eighty-six percent expressed concern that wearables would make them more prone to security breaches.
The Pew Research Center study "Americans' Privacy Perceptions and Behaviors" found that consumers lack confidence in their control over their personal information. Moreover, they are concerned about surveillance by companies and the government. Ninety-one percent agree or strongly agree that consumers have lost control over how personal information is collected and used by companies. Eighty-eight percent believe that it would be difficult to remove inaccurate information about them online. Eighty percent of those who use social networking sites say they are concerned about third parties accessing their data.
On the legislative front, Congress and some federal agencies are investigating the practices of third-party consumer data collectors. The FTC has recommended that Congress pass a law giving consumers the right to have access to their personal data compiled by data brokers. Regulators may require data resellers to periodically provide consumers with free data reports.
The panel discussion Data Privacy Trends 2015, facilitated by the Churchill Club, takes a serious look at user concerns about privacy.
Wearables are becoming more intimate than ever before. Medical wearables startup Quanttus claims that its wristband collects 50 million unique data points and more than 400,000 vital sign measurements per person per day.
Obviously, this much data in the hands of a private company raises eyebrows -- and concerns. The patent damages, litigation risks, and legal issues surrounding privacy and ownership of wearable and IoT data are as yet unknown.
As we shift from opt-outs and unilateral privacy policies to consumer empowerment and data rights, what privacy framework should businesses and governments use? Amyx+McKinsey's "Wearables and IoT Privacy Playbook" provides a framework that not only helps consumers, but also keeps companies and governments one step ahead of the game.
Privacy should not be a stop-gap measure to fend off bad press after a data breach. Rather, privacy itself is a market differentiator and product benefit. Exceptional leaders in wearables and IoT will choose to lead with privacy. Why? It's not simply because they don't want to be perceived as a ticking bomb, but rather because a great leader truly has customers' best interests at heart. Established visionaries voluntarily comply with -- and in some cases exceed -- industry standards to demonstrate their understanding of their customers' needs. When consumers feel that you have their best interests in mind, they will seek out your company's products or services repeatedly.
Besides the obvious bonus of protecting your customers in the digital realm, not leading with privacy can be costly. The list of Top 20 Government-imposed Data Privacy Fines Worldwide from 1999-2014 is a harsh reminder that failing to take consumer interests to heart can be damaging to your bottom line.
Nothing communicates to the market and your customers that you are serious about privacy more clearly than establishing a Chief Privacy Officer (CPO) at the senior executive level of your organization. The CPO is responsible for managing the risks and business impacts of privacy laws and policies, including personal data, quantified-self data, medical data, and financial information, and for compliance with laws and regulations such as HIPAA, the Fair Credit Reporting Act, and the Gramm-Leach-Bliley Act.
The best way to approach privacy is to address it from the design stage. Patchwork after a system is already implemented is difficult and costly. Privacy by Design (PbD) is a framework that takes privacy into account throughout the entire product development and management process. The 7 Foundational Principles of PbD are:
Proactive not reactive; preventative not remedial
Privacy as the default setting
Privacy embedded into design
Full functionality: positive-sum, not zero-sum
End-to-end security: full lifecycle protection
Visibility and transparency: keep it open
Respect for user privacy: keep it user-centric
The Digital Advertising Alliance (DAA), a self-regulatory group composed of advertising and media companies, publishes a number of privacy guidelines on topics such as Self-Regulatory Principles for the Mobile Environment, Online Behavioral Advertising, and Multi-Site Data. The guidelines for the Self-Regulatory Principles for the Mobile Environment establish notice and consent requirements and options for cross-app data, precise location data, and personal directory data. Personal directory data includes calendar, address book, phone and text logs, and photo and video data created by a consumer that are stored on or accessed through a particular device.
The Network Advertising Initiative (NAI), a leading self-regulatory association dedicated to responsible data collection and its use for digital advertising, coordinates with the DAA on best practices and guidelines for online and mobile environments.
Open source standards bodies, such as the AllSeen Alliance, Open Interconnect Consortium, Industrial Internet Consortium, OASIS, and Eclipse Foundation, are working on standardization, reference implementations, and certification programs for wearables and the Internet of Things. These standards help ensure interoperability among products across verticals, regardless of the manufacturer.
In order to support the widespread adoption of products, systems, and services that support wearables and IoT with an open, universal development framework, open source consortiums and alliances are tackling privacy and security at the standards level.
Forrester regularly updates its "Personal Identity And Data Management Playbook" (PIDM) that addresses the tools, technologies, responsibilities, and requirements privacy professionals will need to adopt to build trust in relationships.
The PIDM Playbook outlines best practices for building that trust.
Forrester's "Personal Identity And Data Management Success Starts With Customer Understanding" finds that the ways consumers view, protect, and value their data, as well as their willingness to share it and their motivations for doing so, vary widely by consumer age and by the type of personal data involved.
What does the Forrester study tell us about consumers?
Disclosures should take a drill-down approach. At the highest level, they should be very easy for consumers to understand. Facebook, LinkedIn, and others are moving to a simplified icon and shorter-text approach. Facebook rolled out its Privacy Checkup tool last year to its more than 1.2 billion users to help them easily understand their current privacy settings and change them as needed.
In a cascading approach, disclosures should drill down further to unveil more granular information. At the lowest level, all the required legalese should be present for full disclosure.
Data anonymization is the process of encrypting or removing personally identifiable information so that the individuals the data describe remain anonymous. Techniques include converting identifying text into a non-readable form using pre-image-resistant hashes, and encrypting data and then discarding the decryption keys.
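The hashing technique described above can be sketched in a few lines. This is a minimal illustration, not a production design: the field names and the hard-coded key are hypothetical, and in practice the key would live in a key-management service so that destroying it renders the tokens permanently unlinkable.

```python
import hashlib
import hmac

# Hypothetical secret; in practice this comes from a key vault, never source code.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(identifier: str) -> str:
    """Map an identifier to a fixed token with a keyed, pre-image-resistant hash.

    Without the key, reversing the token back to the identifier is infeasible;
    destroying the key is the "discard the key" step described in the text.
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"name": "Jane Doe", "email": "jane@example.com", "heart_rate": 72}

anonymized = {
    "user_token": pseudonymize(record["email"]),  # stable, non-reversible ID
    "heart_rate": record["heart_rate"],           # keep only the measurement
}
# Direct identifiers ("name", "email") never reach storage.
```

Note that, as the de-anonymization studies cited below warn, hashing identifiers alone does not guarantee anonymity if the remaining fields can be cross-referenced with other data sources.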
Anonymized data in a medical context refers to data from which the patient cannot be identified. A patient's personal data is stripped out, in addition to any other potential information that could identify the individual.
According to Hoang Bao, director of policy, privacy, and data governance at Yahoo, "anonymization is a hotly discussed topic among privacy professionals. There are some well-known studies about how data can be de-anonymized (in which anonymous data is cross-referenced with other data sources to re-identify the anonymous data source), such as Latanya Sweeney's study on how 87% of the US population can be uniquely identified by gender, zip code, and full date of birth, or Arvind Narayanan's work on 33 Bits of Entropy, which showed that re-identification can happen with just 33 bits of information."
Bao indicates that "data anonymization should be considered holistically, in conjunction with all other privacy best practices, to provide end users with a robust privacy protection framework."
Another consideration is to collect and analyze data patterns as an aggregate versus individual data. Examples of aggregate data collection might be crowd management and general sentiment analyses at airports, shopping malls, sports stadiums, public spaces, and live events. These data, of course, would not provide information that would identify any single individual.
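Aggregate-only collection can be realized by never recording a per-person identifier in the first place. The sketch below uses hypothetical venue data (zone names and dwell times are invented for illustration) to show crowd counts and average dwell time computed without any individual-level records.

```python
from collections import Counter
from statistics import mean

# Hypothetical sensor readings: (zone, dwell_seconds) pairs from a venue.
# No per-person identifier is ever captured or stored.
readings = [
    ("gate_a", 40), ("gate_a", 55), ("food_court", 310),
    ("gate_b", 25), ("food_court", 290), ("gate_a", 48),
]

# Crowd management: how many visits were observed in each zone.
crowd_by_zone = Counter(zone for zone, _ in readings)

# General behavior analysis: average dwell time per zone.
avg_dwell = {
    zone: mean(d for z, d in readings if z == zone)
    for zone in crowd_by_zone
}
```

Because each reading carries only a zone and a duration, the stored data cannot single out any individual, which is the property the aggregate approach is meant to guarantee.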
Data minimization is the principle that companies should limit the data they collect, store, and share to the minimum necessary to perform a task, and then dispose of the data once it is no longer needed.
By reducing the amount of data exchanged, data minimization helps reduce the amount of data that can be misused or leaked. Large data stores are attractive to hackers and increase the probability of an attack. Additionally, as more data is collected and stored, the risk skyrockets that the data will be used in a way that departs from consumers' expectations.
Therefore, companies need to carefully evaluate their data needs and develop privacy policies that set reasonable limits on the collection and retention of consumer data.
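A minimization policy of this kind can be enforced in code with two simple mechanisms: a whitelist of fields the task actually needs, and a retention window after which events are purged. The field names and the 30-day window below are hypothetical, chosen only to illustrate the pattern.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical policy: a step-counting feature needs only these fields,
# and raw events are retained for 30 days.
ALLOWED_FIELDS = {"device_id", "timestamp", "step_count"}
RETENTION = timedelta(days=30)

def minimize(event: dict) -> dict:
    """Drop every field the task does not need before the event is stored."""
    return {k: v for k, v in event.items() if k in ALLOWED_FIELDS}

def purge_expired(events: list[dict], now: datetime) -> list[dict]:
    """Dispose of events older than the retention window."""
    return [e for e in events if now - e["timestamp"] <= RETENTION]

raw_event = {
    "device_id": "watch-01",
    "timestamp": datetime(2015, 6, 1, tzinfo=timezone.utc),
    "step_count": 4200,
    "gps_trace": [(40.7, -74.0)],   # not needed for step counting
    "contacts_synced": True,        # not needed at all
}
stored = minimize(raw_event)        # gps_trace and contacts never reach storage
```

Filtering at ingestion, rather than at query time, means the sensitive fields are never at rest anywhere, which shrinks both the breach surface and the risk of later repurposing.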
Unlike persistent data, ephemeral data is only temporary or transitory; it exists only briefly before it is deleted.
Ephemeral apps such as Snapchat have spurred a whole new genre of self-destructing apps, including Wickr, Frankly, Confide, ArmorText, Hash, and Pluto Mail. The allure of these services is that your data will not be archived and searchable via Google -- or worse, subject to the prying eyes of Big Brother. Human rights activists, lawyers, and journalists have flocked to ephemeral apps.
The appeal of ephemeral data is real, but practically speaking, true deletion is difficult due to the complexity of data flows and environments. The FTC targeted Snapchat's claim that its service is ephemeral: Snapchat's photos and videos are not actually deleted and even remain on the app user's device; the app merely flags the files so that the operating system ignores them. Wickr touts its military-grade encryption software that acts as a virtual secure shredder and claims the software makes it impossible for data to be recovered. Can data really ever be completely wiped out? The debate continues, and how your company deals with data can either make customers flock to you -- or make them flee.
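The "secure shredder" idea generally rests on crypto-shredding: store only ciphertext, and make "deletion" mean destroying the key. The toy sketch below (not any vendor's actual implementation) uses a one-time pad so it needs no external libraries; real systems would use vetted authenticated encryption and a separate key-management service.

```python
import secrets

# Toy illustration of crypto-shredding. Data is stored encrypted with a
# per-message random key; shredding the key makes the ciphertext useless noise,
# even if copies of the ciphertext linger on disks or backups.
keys: dict[str, bytes] = {}    # in practice, a separate key-management service
store: dict[str, bytes] = {}

def put(message_id: str, plaintext: bytes) -> None:
    key = secrets.token_bytes(len(plaintext))          # one-time pad key
    keys[message_id] = key
    store[message_id] = bytes(p ^ k for p, k in zip(plaintext, key))

def get(message_id: str) -> bytes:
    key = keys[message_id]     # raises KeyError once the key is shredded
    return bytes(c ^ k for c, k in zip(store[message_id], key))

def shred(message_id: str) -> None:
    del keys[message_id]       # ciphertext remains in store, but is unrecoverable
```

The design choice worth noting is that the ciphertext is deliberately left behind: deletion is reduced to destroying a few bytes of key material, which is far easier to do reliably than scrubbing every copy of the data itself.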
The "Right to be Forgotten" means consumers have the right to have certain data, videos, or photographs about themselves deleted from Internet records so the data cannot be found by search engines and third parties.
There is a distinction between the "right to be forgotten" and the "right to privacy": the "right to privacy" covers information that is not publicly known, whereas the "right to be forgotten" involves removing information that was publicly known at one time and preventing third parties from accessing it.
The European Commission drafted a European Data Protection Regulation that includes specific protections under the "right to be forgotten," superseding Article 12 of Directive 95/46/EC, which gave individuals a legal basis with regard to the Internet.
Forbes columnist Joseph Steinberg advocated for legislation guaranteeing the "right to be forgotten" and noted that "existing laws that require adverse information to be removed from credit reports after a period of time and that allow the sealing or expunging of criminal records, are effectively undermined by the ability of prospective lenders or employers to forever find the removed information in a matter of seconds by doing a Google search."
According to a 2015 poll by Intelligence Squared U.S., 56% of those who voted indicated the US should adopt the "right to be forgotten" online.
According to the Forrester report "Personal Identity Management Success Starts With Customer Understanding," consumers view various types of data differently. They are most concerned about their personal identity data, such as their Social Security number, birthdate, and address. They are far less concerned about behavioral data from the Internet, wearables, and smartphones.
Consumers compartmentalize different data. Financial data is housed and managed by their financial services providers. Health data is entrusted to their hospital and medical professionals. Shopping information is shared with a particular retail entity at the time of the transaction. This shows that privacy opt-in/out should not take an all-or-nothing approach, but should be thoughtfully crafted based on the type of data collected and shared.
As noted earlier, the FTC is pressing Congress to give consumers access to the records data brokers compile about them, and regulators may soon require data resellers to periodically provide consumers with free data reports.
Personal data service and identity management system providers such as Personal.com and Reputation.com have created data vault products to enable users to see or share sensitive data and all the files they store in their data vault. Personal.com securely creates a private network, allowing registered users to share access to data and files through an exchange of encrypted keys without the risk of transmitting the data or files through non-secure, direct means. It also allows users to immediately update data across their own network and revoke access to it when they choose. Reputation.com collects data about consumers' marketing preferences and gives them the option to share information on a limited basis with certain companies in exchange for coupons and status upgrades.
The industry is pushing for data vaults to be made available to consumers voluntarily by data brokers and third parties to give consumers granular preference management for the type of data, how it's stored, and how it's used instead of making these types of options an all-or-nothing proposition. Bao suggests, "Data collection and usage should fuel the business model, as well as protect consumers. A product should strive to find that sweet spot of giving consumers meaningful notice and choice(s) over their data without limiting the core functionality of the product."
Kaiser Permanente voluntarily provides a data vault to its members to manage information about their healthcare, prescriptions, and insurance, as well as allowing them the ability to control access to their data vault to help manage their care.
The fear of data compromise is driving the phenomenon of local data storage in wearables and connected homes. These new storage solutions promise all the benefits of the public cloud without sending your data to the cloud through an Internet connection. Users of local data storage can securely access, stream, auto-backup, and auto-sync their files across all their wearables and smart-home devices at the speed of a local network, using WiFi and Bluetooth Low Energy (BLE). The ReVault smartwatch, for instance, promises wireless, wearable local storage on your wrist.
At the heart of this momentum is the home automation hub. Ninja Sphere is a home controller hub that enables homeowners to monitor the home's temperature, lighting, and energy usage, as well as a pet's presence or anything else connected to the sphere. It boasts capabilities to handle processing offline for faster performance and offers added privacy and security. The collected home data from devices, the environment, and location reside locally on the hub to give users full control over their data. The home automation hub category is driving new competition from SmartThings, Icontrol Networks, Revolv (a Nest Company), and Lutron. EMC, Western Digital, SanDisk, and Cisco will target the home as the new personal data center for those who choose to have complete control of their data.
When it comes to wearables and IoT, the nature of users' concerns is critical. Since most users are somewhat unaware of potential privacy implications, companies should alert them to possible concerns. Innovators will enable their customers to control the type and frequency of data collection, the accessibility of that data, and how the data are shared.
In short, lead with privacy ... or be dragged down by crushing fines and fleeing customers.
Scott Amyx is the founder and CEO of Amyx+McKinsey, a wearables strategy agency specializing in smart wearables strategy and development. He writes for InformationWeek, Wired.com, IEEE Consumer Electronics Magazine, and IEEE Technology and Society Magazine.