October 15, 2014
Rarely is government far ahead of the technology sector in adopting cutting-edge policies designed to produce better results. Surprisingly, that is exactly what is happening with techniques that empower citizens to make better decisions about economics, resource allocation, and privacy.
"Choice architecture" -- influencing outcomes by the way a decision is presented to an individual -- has been incorporated, mostly to the benefit of citizens, into public and private-sector policy making around the globe. Conversely, choice architecture is often employed by the tech sector to the detriment of users by burdening the consumer with the choice of "opting out" of data collection.
Private employers that automatically enroll employees in the company's 401(k) savings plan, placing the burden on individual employees to "opt out" of it, see higher participation and savings rates. Similarly, states such as California and Texas have adopted a "presumed consent" organ donor approach when an individual applies for a driver's license. The application form contains a donor program participation box that is already checked; an applicant must make the effort to "opt out" of the program by checking the other box on the form. Research shows that states using the presumed consent approach have almost twice the donor rate of states that do not. More broadly, governments around the world continue to experiment with choice architecture to produce preferred outcomes for the public good in healthcare, transportation, and housing.
Technology privacy and opting out
Applying this to the tech sector, the burden of "opting out" is often placed on the consumer. Take marketing emails, for example: We have all faced the task of navigating an online marketing program to unsubscribe an email address, with varying levels of success. More concerning are the privacy implications. Some technology providers fail to make opting out of data sharing easy to understand. Unknown to many users, the default setting of many of our technology applications is to automatically opt the user in, which allows user data to be consumed in ways that are advantageous to the provider, but not the user.
A recent example came to light with the hacking of celebrity images from iPhones and servers. This led to the public realization that Apple's default setting is to save all images to the cloud automatically, which, when combined with the lack of two-factor authentication at the time, made those images easier for outside parties to access. In another example, until recently Google's student product suite -- Google Apps for Education (GAFE) -- was mining data from students' email to target ads. Following an outcry from ed-tech and privacy experts, Google reversed course and announced that it would end this practice. Though this type of scanning has been eliminated, Google has yet to say whether it is still mining data and creating user profiles for other business purposes. With choice architecture in mind, shouldn't it be a student's choice not to be exposed to this practice?
Technology providers' new responsibilities
All technology users should demand more protections for their privacy. This means providers must employ choice architecture for the benefit of users -- not themselves. Specifically, providers can frame data privacy decisions in two key ways that maximize transparency and ease of understanding:
All default settings should exclude the user from data sharing and place the burden on the individual to opt in. Framing the data-sharing decision this way would be a 180-degree turnaround for most technology providers. But it wouldn't require much more effort from users, and it would produce notable improvements in the privacy protections of their data.
Where a provider chooses to default to presumed consent, the process to opt out of data sharing should be easy to understand and effortless to navigate. Fine print, difficult-to-locate links, and cumbersome steps are sometimes intended to prevent or discourage a user from opting out of a program. By contrast, Pinterest recently sent an email to registered users explaining changes in how it uses data to display ads. In two sentences, the email told users how to change their account settings by following the included link if they wanted to opt out of having their information used for ads.
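The opt-in-by-default framing described above can be made concrete in code. The sketch below is purely illustrative -- the `PrivacyPreferences` class, its setting names, and its methods are hypothetical, not any real provider's API -- but it shows the principle: every data-sharing flag starts off, and only a deliberate user action turns one on.

```python
from dataclasses import dataclass

# Hypothetical sketch: a per-user preferences object whose defaults
# exclude the user from all data sharing. The user must explicitly
# opt in, setting by setting.
@dataclass
class PrivacyPreferences:
    share_usage_data: bool = False   # opted out by default
    ad_personalization: bool = False # opted out by default
    cloud_backup: bool = False       # opted out by default

    def opt_in(self, setting: str) -> None:
        # Opting in is a deliberate act: the user names the setting.
        if not hasattr(self, setting):
            raise ValueError(f"Unknown setting: {setting}")
        setattr(self, setting, True)

    def opt_out(self, setting: str) -> None:
        # Opting out must be just as simple as opting in.
        if not hasattr(self, setting):
            raise ValueError(f"Unknown setting: {setting}")
        setattr(self, setting, False)

# A new user starts fully opted out of data sharing.
prefs = PrivacyPreferences()
prefs.opt_in("cloud_backup")  # an explicit, user-initiated choice
```

The design choice matters: because the defaults live in one place and are all `False`, a new feature that forgets to ask for consent simply stays off, rather than silently enrolling every user.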
These changes are unlikely to happen overnight. But as consumers become more informed about how their data is used without their permission, they can and should expect more from the technology providers that serve them applications and services every day.