An informed newcomer looks critically at EHRs, clinical decision support software, and proprietary databases and wonders: is health IT all it's cracked up to be?
I'm the new kid on the block. I join InformationWeek Healthcare as editor after spending 27 years wearing several other hats, including medical editor, nursing editor, and clinician. With that experience as a backdrop, I approach healthcare IT with a sense of "guarded optimism."
You've probably heard that expression used by surgeons when assessing patients with a life-threatening condition who have just been through a complex medical procedure and look like they will survive--but the docs don't want to overstate their enthusiasm. That's how I think about the value of IT in solving America's healthcare problems. That's not to suggest that I think healthcare IT is at death's door. In fact, I'm convinced that electronic health records (EHRs), telemedicine, computerized physician order entry (CPOE) systems, and the like have tremendous potential to improve patient care and cut costs. But I maintain a healthy skepticism. I'm not quite sure how much of this potential has actually been realized.
With that in mind, here's my collection of preconceived ideas about healthcare IT as an informed outsider ready to dive into the pool headfirst.
User-Unfriendly Software. Having worked closely with doctors and nurses in the trenches and in academia over the years, I have heard the complaint that software designers don't always understand clinicians' needs because the designers haven't "walked a mile in their shoes." They are simply too far removed from the bedside to offer the kind of functionality needed for effective patient care. Similarly, I've heard that the workflow embedded in many applications isn't user-friendly or intuitive enough, and that IT folks speak a language clinicians have a hard time understanding.
Unreliable Systems. And then there are the unexpected software glitches to contend with. Earlier this year, Swedish Medical Center's centralized electronic medical records (EMR) system shut down unexpectedly for four painful hours. One report said the glitch affected "about 600 providers, 2,500 staffers, and perhaps up to 2,000 patients." Fortunately, no safety issues were reported. That's nothing compared to a software glitch that affected Veterans Affairs health centers in 2009. That electronic hiccup led to several patients receiving incorrect medication doses, as well as delays in treatment.
Too Much Government Pressure? Are the billions of dollars that the federal government has put on the table for EHRs causing IT professionals and vendors to overstate the value of all these electronic tools or to implement them too rapidly? I can't point to any hard data to support this nagging concern, but human nature being what it is, I suspect that greed will win the day, much as it did when diagnosis-related groups and other "innovations" were put in place a few decades ago. With so much money in play, I'm sure some hospitals and group practices will try to game the system--with taxpayers and patients picking up the bill.
Education: The Good And Bad. On a more positive note, I've been impressed with what IT has brought to the field of professional education. There's PubMed, for instance, the huge, government-sponsored bibliographic database that contains more than 20 million citations from medical and life science journals. Even more impressive are some of the proprietary databases that offer practical advice on diagnosis and treatment. One example is UpToDate, a clinical decision-support system available on the Web and via mobile devices that covers some 17 specialties and about 8,500 topics. In essence, it's like having a library of medical textbooks at one's disposal, updated every few months.
Unfortunately, one healthcare IT area that's not getting enough attention is patient education. Having read through lots of blogs and reviewed websites for many of the movers and shakers in IT, I find that most of the emphasis is on the provider side, i.e., what IT can do to improve data entry, diagnosis and treatment, evaluation and management coding, and so on. But not much emphasis has been put on the ultimate end user, the person being treated.
Granted, in many acute care scenarios, patient education isn't a top priority. When someone comes into the emergency department to have a fractured tibia set, he or she doesn't need a whole lot of detailed, complex instructions on how to care for the fractured limb. But so much of patient care revolves around treatment of chronic degenerative diseases like diabetes and coronary artery disease. In situations like these, patient education is crucial because much of the treatment is not something done to patients, but interventions we expect patients to carry out themselves, including changing their lifestyle. Putting serious dollars into technology that improves this kind of patient education makes sense, and I plan to spend more digital ink reporting on innovations in this area.
Where's The Hard Data? My biggest concern, however, is the relative lack of convincing research to show that healthcare IT improves clinical outcomes. Over the years, academic medicine has taken a critical look at the research methodology used to support various treatment protocols, and what has emerged is a subspecialty sometimes referred to as evidence-based medicine. Healthcare IT needs to be subjected to the same scrutiny. We need more randomized controlled trials (RCTs) to replace the anecdotal reports that only hint at the effectiveness of EHRs, clinical decision-support tools, and the like.
Recently, David Blumenthal's team at the Office of the National Coordinator for Health Information Technology published a review in Health Affairs that concluded that "92% of the recent articles on health information technology reached conclusions that were positive overall." Sounds impressive, but I'm not so sure. How many of these articles were RCTs? How many reported statistically significant results that, in practical terms, were too small to be clinically significant? How many studies that questioned the benefits of health IT were never published because of the tendency of research journals to favor positive findings?
Thus my initial comment: I remain guardedly optimistic. A year from now, I may look back and say: "What in the world was I thinking?" Meantime, I hope you, our readers, will offer this newbie some of your wisdom and set me straight when you think I'm off base. I won't always agree with you, but the editors at InformationWeek Healthcare are always open to informed discussion and debate.