Healthcare // Mobile & Wireless
Commentary
6/17/2014 12:00 PM
Alison Diana

When Is Anonymous Data Really Anonymous?

Confusion over the standards for anonymous, or deidentified, data hampers adoption of mHealth apps, curtails innovation, and could generate needless extra laws.


Peer closely through healthcare apps' terms of service and almost invariably you'll discover the information you share ultimately could be "deidentified" or "anonymized."

By reducing your data to one record among many, developers then can share the resulting database with researchers, pharmaceutical companies, government agencies, or anyone else interested in buying this information. What's the harm? Perhaps, hidden among what you ate for breakfast, the hours you slept, or miles you walked, is a cure for cancer. Of course, workout clothing designers, sneaker manufacturers, and granola bar makers could want this information, too.

While the Office for Civil Rights provides guidance on deidentification, there's no way to measure whether app developers abide by these recommendations, says Daniel Castro, director of the Center for Data Innovation, in an interview. Some simply remove users' names from databases before selling them, and call them deidentified, he says. Keeping all other information intact -- such as gender, age, and ZIP code -- could allow another organization or individual to determine a patient's identity.

"A lot of organizations haven't thought too deeply about how to deidentify data. They'll strip out [some] data and that's that," he says. "That's not deidentified data. The government has a really important role to play here. It could really work on developing best practices in this area."

But government has been slow to promote deidentification at all -- and that's a big hurdle that limits healthcare advances, stymies innovation, and is overly protective, given the country's existing patient health information (PHI) privacy rules, according to the Center for Data Innovation. Without patient-created data from apps, wearables, and other sources, the analytics engine will go unfueled. Healthcare providers, payers, and researchers will have access only to clinical or artificial data, remaining locked out of consumers' own information, says Castro. That eliminates too much valuable, honest data from healthcare's datasets, he says.

(Image: Derrick Jones/Flickr)

On Monday, the center released a whitepaper, "Setting the Record Straight: Deidentification Does Work," designed to promote the safe use of deidentified data in healthcare and other markets. "We're really hoping it changes the conversation in Washington about deidentification," says Castro.

Several early studies showed how easily deidentified data could be reidentified, but standards or mandatory guidelines would prevent organizations from taking shortcuts and empower consumers to avoid developers that don't adhere to deidentification best practices.

To ensure patient privacy, consumers' records must be safeguarded so that no record can be uniquely identified or linked to another database that includes personally identifiable information.
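A common way to quantify the first condition is k-anonymity: group records by their quasi-identifier values and find the smallest group; if any group has only one member, that person is uniquely identifiable within the dataset. The article doesn't prescribe a method, so this is a minimal sketch with made-up records, and `k_anonymity` and the field names are illustrative.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Smallest group size when records are bucketed by their
    quasi-identifier values; k == 1 means at least one person is
    uniquely identifiable within the dataset."""
    groups = Counter(
        tuple(r[field] for field in quasi_identifiers) for r in records
    )
    return min(groups.values())

records = [
    {"gender": "F", "age": 43, "zip": "021**"},
    {"gender": "F", "age": 43, "zip": "021**"},
    {"gender": "M", "age": 29, "zip": "100**"},
]

k = k_anonymity(records, ("gender", "age", "zip"))
print(f"k = {k}")  # k = 1: the lone male record is unique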

To meet HIPAA requirements for using deidentified data, organizations must modify or remove 17 elements, the Center for Data Innovation wrote. For example, birth dates can include only the year -- no month or day -- and only the first three digits of a ZIP code can be shared, and only if that three-digit area's population is greater than 20,000 (otherwise the digits are changed to 000).
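A minimal sketch of those two transformations follows. The set of low-population three-digit ZIP prefixes must be derived from census data; the one-element set used here is a stand-in, and `generalize_dates_and_zip` is an illustrative name, not code from the whitepaper.

```python
def generalize_dates_and_zip(record, low_population_zip3s):
    """Apply two Safe Harbor-style generalizations: keep only the year
    of birth, and keep only the first three ZIP digits -- replaced with
    '000' when that three-digit area has 20,000 or fewer residents.
    low_population_zip3s is a stand-in for a census-derived set."""
    out = dict(record)
    out["birth_year"] = record["birth_date"][:4]  # "1971-03-15" -> "1971"
    del out["birth_date"]
    zip3 = record["zip"][:3]
    out["zip"] = "000" if zip3 in low_population_zip3s else zip3
    return out

record = {"birth_date": "1971-03-15", "zip": "03601", "condition": "asthma"}
print(generalize_dates_and_zip(record, low_population_zip3s={"036"}))
# {'zip': '000', 'condition': 'asthma', 'birth_year': '1971'}
```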

Laws such as HIPAA and the Safe Harbor Act protect patients from harm due to a breach of health information, says Castro. Despite rumblings of concern the White House Report on Big Data generated,

Alison Diana has written about technology and business for more than 20 years. She was editor, contributors, at Internet Evolution; editor-in-chief of 21st Century IT; and managing editor, sections, at CRN. She has also written for eWeek, Baseline Magazine, Redmond Channel ...
Comments
jagibbons, 6/17/2014 | 1:32:07 PM
Re: Security
I like that idea, Whoopty. If certain data can be expired and expunged, that data can't be used as part of painting a full picture of who each of us is.
Whoopty, 6/17/2014 | 12:18:20 PM
Security
While I wouldn't necessarily say it's the job of legislation to fix, I'm more worried about the data being stolen than about it being sold. While corporations are unscrupulous at times, as you've said, they're legally bound not to sell on certain details -- but that doesn't matter if someone hacks in and makes a copy.

What about a mandatory maximum length of time for storage of any personal data? 
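A retention window like the one Whoopty proposes is straightforward to sketch: assuming each record stores the timestamp when it was collected, anything older than the mandated maximum can be purged on a schedule. The function and field names here are illustrative, not from any standard.

```python
from datetime import datetime, timedelta, timezone

MAX_RETENTION = timedelta(days=365)  # hypothetical mandated maximum

def purge_expired(records, now=None):
    """Keep only records within the mandated retention window."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["collected_at"] <= MAX_RETENTION]

records = [
    {"collected_at": datetime(2013, 1, 1, tzinfo=timezone.utc), "steps": 9000},
    {"collected_at": datetime(2014, 6, 1, tzinfo=timezone.utc), "steps": 7500},
]
print(purge_expired(records, now=datetime(2014, 6, 17, tzinfo=timezone.utc)))
# only the 2014 record survives
```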