Read closely through a healthcare app's terms of service and you'll almost invariably discover that the information you share ultimately could be "deidentified" or "anonymized."
By reducing your data to one record among many, developers can then share the resulting database with researchers, pharmaceutical companies, government agencies, or anyone else interested in buying the information. What's the harm? Perhaps, hidden among what you ate for breakfast, the hours you slept, or the miles you walked, is a cure for cancer. Of course, workout clothing designers, sneaker manufacturers, and granola bar makers could want this information, too.
While the Office for Civil Rights provides guidance on deidentification, there's no way to measure whether app developers abide by these recommendations, says Daniel Castro, director of the Center for Data Innovation, in an interview. Some simply remove users' names from databases before selling them and call the result deidentified, he says. Keeping all other information intact -- gender, age, and ZIP code, for example -- could allow another organization or individual to determine a patient's identity.
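The risk Castro describes can be made concrete with a short sketch. The data, field names, and matching logic below are purely illustrative (not from any real dataset or app): a "deidentified" health table with names stripped but gender, age, and ZIP intact is joined against a public list that still carries names.

```python
# Hypothetical, synthetic data: names removed, quasi-identifiers kept.
deidentified_health = [
    {"gender": "F", "age": 34, "zip": "32901", "condition": "asthma"},
    {"gender": "M", "age": 61, "zip": "32940", "condition": "diabetes"},
]

# A public list (e.g., a marketing or voter-style file) that keeps names.
public_records = [
    {"name": "Jane Doe", "gender": "F", "age": 34, "zip": "32901"},
    {"name": "John Roe", "gender": "M", "age": 61, "zip": "32940"},
]

def reidentify(health_rows, public_rows):
    """Match rows on the quasi-identifiers left intact after removing
    names -- gender, age, and ZIP -- and recover (name, condition) pairs."""
    matches = []
    for h in health_rows:
        for p in public_rows:
            if all(h[k] == p[k] for k in ("gender", "age", "zip")):
                matches.append((p["name"], h["condition"]))
    return matches

print(reidentify(deidentified_health, public_records))
# When each quasi-identifier combination is unique, every health record
# links back to a name despite the missing name column.
```

This is the essence of a linkage attack: stripping the name column alone does nothing if the remaining columns uniquely describe a person.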
"A lot of organizations haven't thought too deeply about how to deidentify data. They'll strip out [some] data and that's that," he says. "That's not deidentified data. The government has a really important role to play here. It could really work on developing best practices in this area."
But government has been slow to promote deidentification at all -- a hurdle that limits healthcare advances and stymies innovation, and one that is overly protective given the country's existing patient health information (PHI) privacy rules, according to the Center for Data Innovation. Without patient-created data from apps, wearables, and other sources, the analytics engine will go unfueled. Healthcare providers, payers, and researchers will have access only to clinical or artificial data, with consumers' own information locked away, says Castro. That eliminates too much valuable, honest data from healthcare's datasets, he says.
On Monday, the center released a whitepaper, "Setting the Record Straight: Deidentification Does Work," designed to promote the safe use of deidentified data in healthcare and other markets. "We're really hoping it changes the conversation in Washington about deidentification," says Castro.
Although several early studies showed how easily deidentified data could be reidentified, standards or mandatory guidelines would prevent organizations from taking shortcuts and empower consumers to avoid developers that don't adhere to deidentification best practices.
To ensure patient privacy, consumers' records must be safeguarded so that they cannot be uniquely identified or linked to another database that includes personally identifiable information.
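One common way to test the first requirement -- that no record be uniquely identifiable -- is a k-anonymity check: every combination of quasi-identifier values should be shared by at least k records. The field names and sample rows below are illustrative assumptions, not drawn from the article:

```python
from collections import Counter

def k_anonymity(rows, quasi_identifiers):
    """Return the smallest group size across all combinations of the
    given quasi-identifier values. A result of 1 means at least one
    record is unique and could be singled out."""
    counts = Counter(
        tuple(row[q] for q in quasi_identifiers) for row in rows
    )
    return min(counts.values())

rows = [
    {"gender": "F", "age_band": "30-39", "zip3": "329"},
    {"gender": "F", "age_band": "30-39", "zip3": "329"},
    {"gender": "M", "age_band": "60-69", "zip3": "329"},
]

print(k_anonymity(rows, ["gender", "age_band", "zip3"]))
# The lone "M" row forms a group of size 1, so this dataset is
# only 1-anonymous -- a red flag before release.
```

A database that fails such a check is a candidate for further generalization (wider age bands, shorter ZIP prefixes) before it is shared or sold.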
To meet HIPAA requirements for using deidentified data, organizations must modify or remove 17 elements, the Center for Data Innovation wrote. For example, birth dates can include only the year -- no month or day -- and only the first three digits of a ZIP code can be shared, provided the area those digits cover has a population greater than 20,000 (otherwise the ZIP code is changed to 000).
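The two rules above can be sketched in a few lines. This is a simplified illustration, not a compliance tool: the `zip3_population` lookup and the population figures are hypothetical placeholders.

```python
def generalize_record(record, zip3_population):
    """Apply two Safe-Harbor-style generalizations: keep only the birth
    year, and keep only the first three ZIP digits -- or '000' when the
    area covered by those digits has 20,000 people or fewer.
    zip3_population maps a 3-digit ZIP prefix to its population."""
    out = dict(record)
    out["birth_date"] = record["birth_date"][:4]   # "1984-06-02" -> "1984"
    prefix = record["zip"][:3]
    out["zip"] = prefix if zip3_population.get(prefix, 0) > 20000 else "000"
    return out

# Illustrative population figures only.
populations = {"329": 540000, "036": 12000}

print(generalize_record({"birth_date": "1984-06-02", "zip": "32901"}, populations))
# Large area: year kept, ZIP truncated to "329".
print(generalize_record({"birth_date": "1990-01-15", "zip": "03601"}, populations))
# Small area: year kept, ZIP suppressed to "000".
```

The point of the population threshold is the same uniqueness concern as before: a three-digit prefix covering a small populace narrows the candidate pool enough to aid re-identification.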
Laws such as HIPAA, with its Safe Harbor deidentification standard, protect patients from harm due to a breach of health information, says Castro. Despite the rumblings of concern the White House Report on Big Data generated, such harm remains unlikely. That's because organizations are trying to sell to, not harm, consumers -- and because of the many existing laws that protect individuals, their freedoms, and privacy, he adds.
"It's feasible -- since wearable technology using sensors is in people's phones, cars, or homes -- you'll be able to figure out something about somebody's health condition," says Castro. "It's more likely your neighbor knows already. Even if [app developers] could make an educated guess they're not going to sell that information to employers because that information is prohibited by law. The concern is there: It's a legitimate concern, that somebody will make an adverse decision because of healthcare, but more realistically it'll be the result of interactions. That's more likely to be the cause of one of these adverse actions than wearing a Fitbit."
Indeed, the advent of wearables and mHealth apps does not require a separate flurry of legislation, he says. Existing laws, perhaps with some amendments, adequately protect patients and their information, the Center wrote in a letter to the Federal Trade Commission this month. In addition to HIPAA, laws such as the Americans with Disabilities Act (ADA), Genetic Information Nondiscrimination Act (GINA), Fair Credit Reporting Act (FCRA), and Employee Retirement Income Security Act (ERISA) protect consumers and patients, the organization said.
With more participants from across the wellness spectrum, mHealth data could lead to vital insight, treatments, and perhaps cures. Until all app developers speak the same language and operate under the same rules, some patients could find deidentification and anonymity hurdles too high to clear.
Alison Diana is an experienced technology, business, and broadband editor and reporter.