application FairWarning. The software tracks whether employees or clinicians have legitimate reasons for opening particular records.
Because the software records anyone who accesses a record, "you can't hide from the EHR anymore like you could with a paper chart," Rosenhagen says. He contends the EHR provides a more secure environment than paper records did.
Courion, which developed similar software, has clients such as Miami Children's Hospital, Quest Diagnostics, and Memorial Sloan Kettering Cancer Center. These clients use the software to automatically update access rights as employees are hired or terminated, limiting record access to authorized personnel retrieving legitimate information, says Courion's Zannetos.
The right of access to patient records changes over time, creating an added challenge. Parents, for example, control access to their children's EHRs -- but only while they're children.
"Once they turn 18 you've got to turn that off and give access to the 18-year-old, who's supposedly no longer a child," says Zannetos, father of two young adults. Zannetos says the healthcare industry needs to learn from the credit card industry and how it considers a wide range of factors to spot fraud. "We have to constantly watch through the very complex connections between people, apps, access rights, and what they're doing, and raise alerts when things look like they're out of the norm," he says.
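The kind of monitoring Zannetos describes can be sketched as a simple baseline check over an access log: flag lookups that fall outside a user's usual pattern. This is an illustrative toy, not any vendor's actual product; the field names and the 5% threshold are assumptions.

```python
from collections import Counter

def flag_anomalous_access(access_log, user_id):
    """Flag record lookups that deviate from a user's usual pattern.

    access_log: list of dicts like {"user": ..., "patient_dept": ...}
    (hypothetical schema). Returns the events worth alerting on.
    """
    user_events = [e for e in access_log if e["user"] == user_id]
    if not user_events:
        return []
    dept_counts = Counter(e["patient_dept"] for e in user_events)
    total = sum(dept_counts.values())
    alerts = []
    for event in user_events:
        share = dept_counts[event["patient_dept"]] / total
        # A department this user almost never touches is suspicious;
        # the 5% cutoff is an arbitrary illustrative threshold.
        if share < 0.05:
            alerts.append(event)
    return alerts
```

A production system would weigh many more signals -- time of day, relationship to the patient, role, device -- much as card networks score each transaction.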
IT departments also must guard against patients' errors. All too often consumers use the same password for multiple sites. If a breach occurs at an unrelated site, users might think their data is secure but cyberthieves could now have the password that protects their personal health information, Zannetos says. Organizations might want to enforce frequent password changes, require multicharacter passwords, or assign passwords to consumers, rather than allowing them to use their own creation.
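The password rules suggested above -- minimum length and character variety -- can be enforced with a short validation routine. This is a minimal sketch; the exact length and class requirements are assumptions, not a cited policy.

```python
import re

def password_is_acceptable(password, min_length=12):
    """Reject short passwords and those lacking character variety.

    Thresholds (12 characters, 3 of 4 character classes) are
    illustrative assumptions.
    """
    if len(password) < min_length:
        return False
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
    # Require at least three of the four character classes.
    return sum(bool(re.search(c, password)) for c in classes) >= 3
```

Reuse across sites is the harder problem, though: a strong password offers no protection once it leaks from another breached service, which is why Zannetos's point favors rotation or organization-assigned credentials.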
Some providers have moved beyond portals and extend complete access to patients through the OpenNotes initiative, says David Harlow, principal at The Harlow Group, a healthcare legal and consulting firm. In March, for example, WellSpan Health began offering patients access to office-visit notes, as well as lab results, physical exam information, immunizations, and imaging studies.
"It means really sitting side by side with a patient in front of the computer screen, rather than having the computer screen between the doctor and patient, in order to share that information in real-time during the office visit," Harlow says. "It's a real culture change."
To promote access to all electronic records, regardless of providers' EHRs, the federal government and participating partners use Blue Button, a technology that lets consumers click on a blue link to view online, download, and share their records. Although not all providers participate today, HealthIT.gov claims the roster is expanding rapidly.
All for one, one for all
Patients lose control of their data at one point: when identifiable information is removed and organizations use the vast collection of health data for analytics.
"If it's de-identified, then it's not considered to be that patient's information anymore," Harlow says.
HIPAA defines one formal process to de-identify data. It requires stripping out all potentially identifiable information, an approach that safeguards patients but deprives researchers of useful data, he says. Statistical de-identification, which uses techniques that allow inclusion of certain demographic points, is more valuable to researchers, Harlow adds. Optum Labs, for example, uses multiple de-identification steps when it receives data from Humedica and provides pockets of data to authorized researchers, says Paul Wallace, chief medical officer at Optum Labs. Both approaches satisfy HIPAA rules to preserve patient anonymity, although statistical de-identification -- while more useful -- is also more costly.
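The contrast between the two approaches can be sketched as follows: one drops every potentially identifying field outright, while the other generalizes demographics so researchers retain coarse signal. The field names and generalization rules below are illustrative assumptions, not Optum Labs' actual pipeline or HIPAA's exact field list.

```python
# Fields treated as potentially identifying in this toy example.
STRIP_OUTRIGHT = {"name", "ssn", "address", "phone", "email", "zip", "age"}

def strip_identifiers(record):
    """First approach: drop every potentially identifying field."""
    return {k: v for k, v in record.items() if k not in STRIP_OUTRIGHT}

def statistical_deidentify(record):
    """Second approach: generalize demographics instead of losing them."""
    out = strip_identifiers(record)
    if "zip" in record:
        out["zip3"] = record["zip"][:3]  # coarse geography: 3-digit ZIP prefix
    if "age" in record:
        out["age_band"] = f"{(record['age'] // 10) * 10}s"  # e.g. "40s"
    return out
```

The first function leaves researchers little to correlate on; the second preserves region and age band, which is what makes statistically de-identified data more useful -- and, as the article notes, more costly to produce and certify.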
Organizations use this statistically de-identified patient data for everything from healthcare and provider quality control and treatment improvement, to researching new medicines and finding new relationships between disease causes and effects.
However, statistically de-identifying data isn't perfect. In a study last year, the Whitehead Institute for Biomedical Research, a nonprofit research and teaching institution with programs in cancer research, developmental biology, genetics, and genomics, was able to re-identify 50 people who had submitted personal DNA data to genomics studies such as the 1000 Genomes Project. The odds of being named from a de-identified database were 4 in 10,000, according to a 2005 study. Since then, consumers have shared more identifiable information via social media and apps, and more information has become digitally available, so re-identification may be more likely today.
Rather than de-identify data, researchers should be held responsible for protecting personal data and privacy, recommends an article on the Association for Computing Machinery's website written by Jon P. Daries, Justin Reich, Jim Waldo, Elise Young, Jonathan Whittinghill, Daniel Thomas Seaton, Andrew Dean Ho, and Isaac Chuang. Although they focused on students in higher education, the authors argue de-identification forces changes to data that threaten analysis and weaken the results. Too much concern for de-identifying could stifle important research, they say.
Patients worried about their data being re-identified might lie to medical professionals to hide alcohol, drug, or physical abuse, or to conceal embarrassing symptoms. Others are concerned insurers or employers will combine readily available credit card information with health data to paint a clear picture of consumers' cigarette, fast food, or liquor purchases.
Although one person's information speaks solely to that individual's health, the records of an entire population paint a broader picture, one that holds clues to cures, treatments, and prevention. Consumers might generate their personal health data, but they don't own their records. If we're all to reap the benefits of that collective knowledge, it's up to organizations that steward this data to protect it from those that seek to use it for illegal, unethical, or harmful purposes.