Wearables In Healthcare: Privacy Rules Needed

The Johns Hopkins patient privacy violation didn't involve Google Glass or wearables, but it shows why the healthcare industry must head off trouble with wearables in clinical settings.

Alison Diana, Contributing Writer

July 23, 2014

4 Min Read

(Slideshow: Healthcare IT Cloud Safety: 5 Basics)

Johns Hopkins Hospital has agreed to pay $190 million to settle claims from thousands of former patients of gynecologist Dr. Nikita Levy, who committed suicide in 2013 after being accused of secretly recording women and girls during examinations. Although Levy is not alleged to have used wearable technologies, the case should serve as a lesson for CIOs and security officers on how to avoid potential abuse in their organizations.

The settlement, one of the largest recorded involving sexual misconduct, according to media accounts, came after investigators discovered more than 1,300 videos and images during multiple searches of Levy's office and home. Reportedly, Levy took the images via tiny cameras hidden in pens and key fobs. An alert co-worker became suspicious about a small device and contacted hospital administrators, who took it from Levy and called Baltimore police. Police and federal investigators found no evidence that Levy shared images online or with others, reports said.

[Online tracking technology is outpacing privacy protection -- it's time to revisit regulations. Read Web Tracking Advances Beat Privacy Defenses.]

Because patients' faces were not visible, Johns Hopkins treated all of Levy's patients as victims. The organization said in a statement:

We have come to an agreement that the plaintiffs' attorneys and Johns Hopkins Health System believe is fair and properly balances the concerns of thousands of plaintiffs with obligations the Health System has to provide ongoing and superior care to the community. It is our hope that this settlement -- and findings by law enforcement that images were not shared -- helps those affected achieve a measure of closure. All funds will come from insurance.

This settlement, which has been formalized by the plaintiffs' attorneys and the Health System and given preliminary approval by the judge, will not in any way compromise the ability of the Health System to serve its patients, staff and community.

We assure you that one individual does not define Johns Hopkins. Johns Hopkins is defined by the tens of thousands of employees who come to work determined to provide world-class care for our patients and their families.


While the violation did not involve wearable computing devices such as Google Glass, Internet of Things (IoT) products, or smartwatches, Levy did use technology to illicitly record patients when they were at their most vulnerable. Could the case affect healthcare organizations' adoption of such devices -- especially as some medical and consumer advocates have already voiced concern over potential security and privacy flaws?

The simple answer is yes. As Johns Hopkins learned the hard way, one rogue clinician casts a long, costly shadow. So how can IT and healthcare professionals protect patients and organizations from similar intrusions, especially as healthcare providers and professionals adopt more portable, smaller technologies?

First, it's vital for CIOs, risk-prevention executives, chief medical officers, and clinicians to agree on stringent guidelines that meet healthcare, privacy, and security mandates. This team must then ensure that all staff members learn the rules, understand how to report breaches, and receive regular reminders about these practices and the penalties for violating them. Nobody should take hospital-owned wearable devices home, nor should staff be allowed to operate personal wearables in clinical settings.

IT must create an auditable trail of any images created or stored by Glass or other small cameras, to safeguard videos and pictures from unauthorized use. Healthcare providers may also want to create a separate release form for Google Glass use in the operating room or other medical settings, one that clearly explains how and why physicians use the device and its data, and where images are stored. WiFi, the backbone of many of these devices, must also be robust and well protected to keep data away from hackers. It's also wise to steer pilot programs away from especially sensitive examinations or specialties.
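To make the auditable-trail idea concrete, here is a minimal sketch of the kind of record IT might log each time a wearable captures an image. All names and fields are hypothetical illustrations, not any hospital's or vendor's actual system; the point is simply that every capture can be tied to a device, a clinician, a location, and a signed release form.

# Minimal sketch (hypothetical field names): one append-only audit record per image
# captured by a clinical wearable, so IT can later show who recorded what, where,
# and with which device.
import hashlib
import json
from datetime import datetime, timezone

def make_audit_record(device_id, clinician_id, location, image_bytes, consent_form_id=None):
    """Build one audit entry; the image stays in secure storage, only its hash is logged."""
    return {
        "timestamp_utc": datetime.now(timezone.utc).isoformat(),
        "device_id": device_id,              # e.g., asset tag of the Glass unit
        "clinician_id": clinician_id,        # badge or directory ID of the wearer
        "location": location,                # OR suite, exam room, etc.
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "consent_form_id": consent_form_id,  # ties the capture to a signed release
    }

if __name__ == "__main__":
    record = make_audit_record("GLASS-0042", "DR-1187", "OR-3", b"<image bytes>", "CONSENT-2014-0731")
    # In practice this would go to an append-only, access-controlled log, not stdout.
    print(json.dumps(record, indent=2))

Logging only a hash of the image keeps the record useful for audits without duplicating protected health information outside secure storage.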


About the Author

Alison Diana

Contributing Writer

Alison Diana is an experienced technology, business and broadband editor and reporter. She has covered topics from artificial intelligence and smart homes to satellites and fiber optic cable, diversity and bullying in the workplace to measuring ROI and customer experience. An avid reader, swimmer and Yankees fan, Alison lives on Florida's Space Coast with her husband, daughter and two spoiled cats. Follow her on Twitter @Alisoncdiana or connect on LinkedIn.
