For the last couple of years, I've been educating Health First employees that mobile devices are worth their weight in gold -- certainly as business-enhancing tools that make us productive anywhere and anytime. But in a larger sense, the data they carry represents a financial liability far greater than the hardware's value. I've challenged executives, clinicians, and IT experts to look at a mobile device not as a stylish, expensive tech asset, but rather as a million-dollar data liability. A lost or stolen device can (and routinely does) cost companies millions, I have argued. More recently, unsecured devices have come to represent the potential for billions in remediation and restitution.
I had hoped this message would be heard -- but not by a community intent on exploiting this new reality. Recently, for example, I read about the armed robbery of a physician, who kept his life in exchange for handing over his laptop, phone, and encryption keys and passwords. I should not have been shocked -- or even surprised.
Criminals know how easily they can breach even our best defenses because of how interconnected and "available" we have become -- and because of how accessible and immediately responsive we as a society require our business and (in my world) clinical teams to be. It might be time to reassess the true costs of making data available anywhere, at any time.
So what has changed? The value of medical identity information continues to grow, and according to multiple reports the target placed on the health community far surpasses other industries. 2013 saw a 20% increase in medical identity theft, and some industry experts suggest the reason is simple: We're a soft target, offering a 50-to-1 return on each identity stolen compared to financial identity theft. A CBS Nightly News report suggested the highest-quality data and most complete identities for nefarious purposes come from the medical community; in Miami, for example, $1,000 for 100 names was the going rate.
That's low, according to those close to DarkNet communities. For $50, they say, a criminal can purchase a medical identity that mirrors their own ailments so they can seek "free" medical services that would not raise red flags to a clinician. Need a new knee? Here's a medical identity for someone who is about your size, age, and gender, and whose medical history indicates a replacement joint is in his or her future. For $250, the criminal will throw in a fake ID and insurance card to match the identity.
By the time the patient and insurance company victims figure out a $50,000 fraud has been committed, the criminal will be out of rehab and never heard from again. This is real and expensive: Medical identity fraud is estimated to cost the industry between $35 billion and $80 billion each year. That's a big spread, you might think -- and you'd be right, because we simply don't know exactly how much of our healthcare delivery dollar we're spending on fraud.
In November, Interpol, the Federal Bureau of Investigation, and the Department of Homeland Security shut down Silk Road 2.0 and filed charges against its alleged operator. The DarkNet site was dedicated to brokering stolen information and hacking for hire -- a Craigslist equivalent for all things illegal. For sale: controlled substances delivered direct to you, large- and small-caliber weapons, human trafficking, child pornography, murder for hire, and all manner of identity information.
Tellingly, the original Silk Road DarkNet site had been shut down only one year earlier. The Internet will spawn another marketplace -- it probably already has -- and illegally obtained data will continue to retain its value.
So what have we done to change things? We've tried encrypting stored data, and we've tried enforcing secure remote access and virtualization for client applications. The best and most innovative changes in our industry seek to distance the data from the device, which works well when devices are lost or stolen -- but not in cases like the aforementioned robbery. There, all of these protections were circumvented because the keys and passwords -- the means for legitimately providing access -- were also held hostage.
Imagine if the robbery victim had been a database administrator -- someone who remotely manages the entire patient index -- instead of a doctor who sees a few hundred patients. Millions of individuals would have been at risk. Your average doctor probably doesn't have that level of access to raw data, so the compromise was limited to what was stored or accessed between the time of the mugging and the notification to change the passwords. For the record -- to all the bad guys and gals out there -- I don't have that kind of access either. But if I did, I would not hesitate (nor would I advise anyone else to hesitate) to hand over the keys to the data kingdom if my life or the life of someone in my family were at risk. It sounds like a movie script -- and indeed, a few action thrillers have made hostage-induced insider data exfiltration into a very cloak-and-dagger plot line.