Confusion over standards for anonymizing, or deidentifying, data hampers adoption of mHealth apps, curtails innovation, and could prompt needless new legislation.
Peer closely through healthcare apps' terms of service and almost invariably you'll discover the information you share ultimately could be "deidentified" or "anonymized."
By reducing your data to one record among many, developers can then share the resulting database with researchers, pharmaceutical companies, government agencies, or anyone else interested in buying this information. What's the harm? Perhaps, hidden among what you ate for breakfast, the hours you slept, or the miles you walked, is a cure for cancer. Of course, workout clothing designers, sneaker manufacturers, and granola bar makers could want this information, too.
While the Office for Civil Rights provides guidance on deidentification, there's no way to measure whether app developers abide by these recommendations, says Daniel Castro, director of the Center for Data Innovation, in an interview. Some simply remove users' names from databases before selling them, and call the data deidentified, he says. Keeping all other information intact -- gender, age, and ZIP code, for example -- could allow another organization or individual to determine a patient's identity.
"A lot of organizations haven't thought too deeply about how to deidentify data. They'll strip out [some] data and that's that," he says. "That's not deidentified data. The government has a really important role to play here. It could really work on developing best practices in this area."
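The risk Castro describes is a linkage attack: even with names stripped, a combination of quasi-identifiers such as gender, age, and ZIP code can single out a record and match it against an outside database. A minimal sketch of how one might measure that exposure, using hypothetical sample records (the field names and data are illustrative, not from any real dataset):

```python
from collections import Counter

# Hypothetical records with names already removed -- "deidentified" naively.
# Quasi-identifiers such as gender, age, and ZIP code remain intact.
records = [
    {"gender": "F", "age": 34, "zip": "30309", "steps": 8200},
    {"gender": "M", "age": 51, "zip": "30309", "steps": 4100},
    {"gender": "F", "age": 34, "zip": "30318", "steps": 9900},
    {"gender": "M", "age": 29, "zip": "02139", "steps": 7600},
]

def unique_fraction(rows, keys):
    """Fraction of rows whose quasi-identifier combination is unique.
    Each unique row could potentially be matched against an outside
    database that links those same attributes to a name."""
    combos = Counter(tuple(r[k] for k in keys) for r in rows)
    unique = sum(1 for r in rows if combos[tuple(r[k] for k in keys)] == 1)
    return unique / len(rows)

# Every record here is uniquely identified by (gender, age, zip),
# so stripping names alone offered no real protection.
print(unique_fraction(records, ["gender", "age", "zip"]))
```

In this toy dataset, every (gender, age, ZIP) combination is unique, so the fraction comes out to 1.0 -- the kind of result that shows why simply deleting the name column is not deidentification.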
But government has been slow to promote deidentification at all -- and that's a big hurdle that limits healthcare advances and stymies innovation, while being overly protective given the country's existing patient health information (PHI) privacy rules, according to the Center for Data Innovation. Without patient-created data from apps, wearables, and other sources, the analytics engine will go unfueled. Healthcare providers, payers, and researchers will have access only to clinical or artificial data, locked out of consumers' own information, says Castro. That eliminates too much valuable, honest data from healthcare's datasets, he says.
On Monday, the center released a whitepaper, "Setting the Record Straight: Deidentification Does Work," designed to promote the safe use of deidentified data in healthcare and other markets. "We're really hoping it changes the conversation in Washington about deidentification," says Castro.
Several early studies showed how easily deidentified data could be reidentified, but standards or mandatory guidelines would prevent organizations from taking shortcuts and empower consumers to avoid developers that don't adhere to deidentification best practices.
To ensure patient privacy, consumers' records must be protected both from being uniquely identified and from being linked to other databases that contain personally identifiable information.
To meet HIPAA requirements for using deidentified data, organizations must modify or remove 17 elements, the Center for Data Innovation wrote. For example, birth dates can only include the year -- no month or day, and only the first three digits of a ZIP code can be shared if the population is greater than 20,000 (or changed to 000 if it's a smaller populace).
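Those two rules from the whitepaper -- year-only birth dates and truncated ZIP codes -- can be sketched as simple transformations. This is an illustration of the stated rules only, not a complete Safe Harbor implementation; the set of restricted three-digit ZIP prefixes below is hypothetical, standing in for the real list of areas with populations of 20,000 or fewer:

```python
def generalize_birth_date(iso_date):
    """Keep only the year of an ISO date string ("YYYY-MM-DD"),
    per the rule that birth dates may include the year alone."""
    return iso_date[:4]

# Hypothetical stand-in for the real list of three-digit ZIP prefixes
# covering 20,000 or fewer people, which must be replaced with "000".
RESTRICTED_ZIP3 = {"036", "059", "102"}

def generalize_zip(zip_code):
    """Truncate a ZIP code to its first three digits, or to "000"
    if the prefix falls in a low-population area."""
    zip3 = zip_code[:3]
    return "000" if zip3 in RESTRICTED_ZIP3 else zip3

print(generalize_birth_date("1984-07-21"))  # "1984"
print(generalize_zip("05901"))              # "000" (restricted prefix)
print(generalize_zip("30309"))              # "303"
```

The point of the generalization is to shrink each record's quasi-identifier combination so that many people share it, which is what makes linkage back to an individual impractical.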
Laws such as HIPAA, with its Safe Harbor deidentification standard, protect patients from harm due to a breach of health information, says Castro. Despite rumblings of concern the White House Report on Big Data generated,
Alison Diana has written about technology and business for more than 20 years. She was editor, contributors, at Internet Evolution; editor-in-chief of 21st Century IT; and managing editor, sections, at CRN. She has also written for eWeek, Baseline Magazine, Redmond Channel ...