Internet of Things: Current Privacy Policies Don't Work
Traditional ways to deliver privacy guidelines, such as online postings or click-through mechanisms, don't work with the Internet of Things.
The Internet of Things has gone mainstream. Consumers can use devices to control things in their houses, from appliances to pet-food dispensers. Applications on mobile devices can measure how far and how fast the wearer has run or walked, and can track heart rate and blood pressure. Connected sensors and devices, and their potential uses, are proliferating.
But discussions about the data created are far more likely to focus on how to use the data than on how to protect it. While devices and applications are generally designed and implemented with data protection in mind, that is unlikely to be enough. Developers and users must consider the broader implications for individual privacy as vast amounts of information -- about health, browsing history, purchasing habits, social and religious preferences, and finances, among other things -- accumulate.
The crucial question for the owner of the app or the device is whether data collection is limited to an identified purpose. The crucial question for users is whether they can determine when, how, and to what extent their information is communicated to others.
Traditional privacy notions rely upon the Fair Information Practice Principles (FIPPs). While we can certainly look to FIPPs for guidance, they can't adequately address the issues posed by the Internet of Things, because the traditional ways to deliver privacy guidelines -- posting them online, mailing them, and online click-through mechanisms -- don't really work with IoT. Today, the data being collected by these devices is largely invisible to us. For instance, a driver behind the wheel has little discretion over traffic sensors that transmit speed, license-plate numbers, and location. Given the number of devices transmitting information during the course of a day, requiring notice and choice quickly becomes unwieldy, and innovation suffers.
The Federal Trade Commission has taken the position that developers of devices and apps must consider both use and collection restrictions and, in conjunction with the development of devices and apps, consider privacy by design, simplified choice, and transparency -- i.e., address the privacy issues by incorporating the core principles of FIPPs.
That FTC expectation is difficult to meet in practice. Use restrictions are generally ineffective because they depend upon self-enforcement or third-party enforcement, and confidential information can't be retrieved once it is released. The same is true of collection restrictions: It is impossible to monitor every device to confirm that the data being collected is consistent with the purpose intended. Enforcement is complicated because there are multiple players -- the manufacturer, service networks, advertisers, and carriers, to start with -- and only the most egregious offenders are likely to attract regulators' attention.
Developers and users can address some of the questions that the IoT raises by studying new approaches to protecting privacy, starting with those that account for the continuous communication of individual information. New approaches to IoT privacy should:
Clearly and completely state the purpose for collection and the related context, including potential benefits to the individual. Collecting personal health data such as pulse rate, blood pressure, activity, and other vital statistics might be expected when it is being transmitted to an individual's healthcare provider, which, in turn, may lead to more effective medical treatment. But individuals may not know that a fitness bracelet or a mobile phone app that transmits such data might also be used to market other products or medications to them.
Make personally identifiable data anonymous whenever possible in ways that prevent re-identification, so that users don't need to be concerned about the nature and use of data gathered by IoT devices.
Explain the criteria used to gather and retain data, and communicate whether data is being retained to improve products, support further research, enhance security, etc. This sounds easy in the context of traditional privacy notices, but it may be problematic with products like Google Glass that gather data from all kinds of sources in the surrounding environment, making it impossible to state with specificity what the device is gathering and retaining. Finding ways to keep personally identifiable data anonymous with these types of devices may be the way to address this problem.
Monitor data transmissions so that misuses can be blocked or trigger notices to affected users.
Provide users reasonable access to their personally identifiable information, and give them the ability to change or correct it.
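The anonymization recommendation above can be made concrete with a short sketch. The Python example below is a minimal illustration, not a standard or a regulator-endorsed method: it replaces a direct identifier with a keyed hash (HMAC) so that downstream systems can still correlate records from the same device without ever seeing the raw identifier. The function and field names are hypothetical, and keyed hashing alone does not defeat re-identification from quasi-identifiers such as location or timestamps, which would need techniques like aggregation or k-anonymity.

```python
import hashlib
import hmac
import secrets

def pseudonymize(identifier: str, key: bytes) -> str:
    # Keyed hash: stable for a given key, so records from the same
    # device can still be linked, but the raw identifier never
    # appears in the transmitted data.
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The key must stay on the trusted side (e.g., the vendor's server);
# anyone holding it could re-link pseudonyms to real identifiers.
key = secrets.token_bytes(32)

record = {"device_id": "AB:CD:EF:12:34:56", "pulse": 72, "steps": 8412}
safe_record = {**record, "device_id": pseudonymize(record["device_id"], key)}
```

Rotating the key periodically limits how long any single pseudonym can be tracked, at the cost of breaking longitudinal linkage across key rotations.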
The traditional privacy notice did not conceive of an Internet of Things. As the number of connected devices expands, the data collected will undoubtedly yield social benefits. The challenge, however, will be finding a privacy paradigm that respects individual rights, accommodates choice, and ensures that those social benefits don't come at the cost of individual privacy. Progress won't wait for us to develop new ways to deal with this challenge, which is why we must give serious consideration to new approaches now.
Marc Loewenthal is Director at Promontory Financial Group, where he advises clients on governance, risk, and compliance matters, with a particular emphasis on privacy and information security. His areas of expertise include advice on privacy governance and privacy management.