Put another way, electronic health records have lots of check boxes to fill in to document symptoms, diagnoses, drug choices, orders for procedures, and so on. But the natural tendency of physicians and nurses is to think in narrative terms and to dictate those stories as free text.
One solution that has emerged in recent years is voice recognition software. Most provider organizations are familiar with Nuance's Dragon Medical, for instance, which lets clinicians dictate their notes directly into an EHR instead of first sending them out for transcription. Once the program generates the text, the clinician can edit it and sign off on the final version.
The major problem with this approach is that it produces a huge repository of free-text narrative that can't populate the structured components of a hospital's clinical information system, leaving a potential treasure trove of valuable facts and insights in limbo.
Enter natural language processing (NLP), which can understand terms used in everyday speech--including a variety of synonyms for the same concept--and extract the most relevant ones from a narrative to populate all those check boxes and drop-down menus that we love to hate.
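To make the idea concrete, here is a deliberately tiny sketch of the kind of synonym normalization involved. Real clinical NLP engines map text onto controlled vocabularies far larger and subtler than this; the phrase list, canonical terms, and function name below are all invented for illustration.

```python
# Toy illustration (not a production clinical NLP engine): normalize
# free-text synonyms onto a small controlled vocabulary so a dictated
# note can populate structured EHR fields. The vocabulary is invented.
import re

SYNONYMS = {
    "heart attack": "myocardial infarction",
    "high blood pressure": "hypertension",
    "htn": "hypertension",
    "sugar diabetes": "diabetes mellitus",
}

def extract_structured_terms(narrative: str) -> set[str]:
    """Return the canonical terms found in a dictated narrative."""
    text = narrative.lower()
    found = set()
    for phrase, canonical in SYNONYMS.items():
        # Word boundaries keep short synonyms like "htn" from matching
        # inside longer words.
        if re.search(r"\b" + re.escape(phrase) + r"\b", text):
            found.add(canonical)
    return found

note = "Patient with high blood pressure and a prior heart attack in 2009."
print(sorted(extract_structured_terms(note)))
# → ['hypertension', 'myocardial infarction']
```

Note that both "heart attack" and "high blood pressure" come out as their canonical clinical terms, which is exactly what a structured field expects.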
Nuance and several other vendors are now combining NLP with voice recognition, taking the technology to the next logical level with what's being called clinical language understanding, or CLU.
MModal, for instance, has created a CLU tool that keeps doctors informed of relevant patient data as they add their comments to the EHR. According to Chris Spring, MModal's VP of health IT, the platform "listens" to a clinician's dictation in real time and tells her if she's missing any vital information already in the patient's chart. So, for example, if the doctor is dictating notes about a patient with chronic obstructive pulmonary disease and she's unaware that the EHR contains spirometry readings--which measure lung capacity--the system will alert her to the existence of those readings.
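In outline, that alerting behavior amounts to a lookup from dictated conditions to relevant chart data. The sketch below is my own guess at the shape of such a check, not MModal's implementation; the condition-to-test mapping and the function name are assumptions.

```python
# Illustrative sketch only: the condition-to-test mapping and function
# name are invented, not MModal's actual logic. The idea: flag chart
# data relevant to whatever condition the clinician is dictating.

RELEVANT_TESTS = {
    "copd": ["spirometry"],
    "diabetes": ["hba1c"],
}

def chart_alerts(dictation: str, chart_results: set[str]) -> list[str]:
    """List chart results relevant to conditions mentioned in the dictation."""
    text = dictation.lower()
    alerts = []
    for condition, tests in RELEVANT_TESTS.items():
        if condition in text:
            for test in tests:
                if test in chart_results:
                    alerts.append(
                        f"Chart contains {test} results relevant to {condition}."
                    )
    return alerts

print(chart_alerts("Assessment: COPD exacerbation.", {"spirometry", "cbc"}))
# → ['Chart contains spirometry results relevant to copd.']
```

The real system works against live dictation rather than a finished string, but the core pattern--condition mentioned, relevant data present, alert raised--is the same.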
Similarly, Dragon Medical 360/M.D. Assist, a collaboration between Nuance and 3M, uses CLU to query clinicians as they enter data into a patient's EHR, helping to produce a more complete document. It detects missing details and unclear associations between relevant findings, and helps pinpoint a specific diagnosis. If, for example, the doctor documents respiratory failure, the software prompts him to specify whether it's acute or chronic. Those extra details can often mean the difference between quick insurance coverage and needless delays.
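The specificity prompt described above can be thought of as a rule: a diagnosis documented without any of its required qualifiers triggers a question. The rules, prompt wording, and function name in this sketch are my own assumptions, not the product's actual logic.

```python
# Hypothetical sketch of a documentation-specificity check: the rules,
# prompt text, and function name are assumptions for illustration only.

SPECIFICITY_RULES = {
    "respiratory failure": ("acute", "chronic"),
    "heart failure": ("systolic", "diastolic"),
}

def specificity_prompts(note_text: str) -> list[str]:
    """Prompt for a qualifier when a diagnosis is documented without one."""
    text = note_text.lower()
    prompts = []
    for diagnosis, qualifiers in SPECIFICITY_RULES.items():
        if diagnosis in text and not any(q in text for q in qualifiers):
            prompts.append(
                f"Specify {' or '.join(qualifiers)} for '{diagnosis}'."
            )
    return prompts

print(specificity_prompts("Impression: respiratory failure, on BiPAP."))
# → ["Specify acute or chronic for 'respiratory failure'."]
```

Once the clinician writes "acute respiratory failure," the qualifier is present and the prompt goes away--which is exactly the extra detail payers look for.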
ChartLogic, an EHR vendor, is also offering similar dictation and natural language processing capabilities in a product it calls Stella, a voice recognition overlay compatible with leading EHR systems. Stella uses the same type of cloud-based natural language processing employed in Apple's Siri, and can be used on iPads, iPhones, and other mobile devices to interact with a hospital's EHR system.
There's little doubt in my mind that CLU can speed up clinical documentation, improve workflow, and perhaps even increase third-party reimbursement for services rendered. What's not clear at this point is whether any of this advanced technology will make patients healthier. And since healthcare's thought leaders keep talking about the need to make medical care more "patient centric," isn't that what we're ultimately aiming for?
The 2012 InformationWeek Healthcare IT Priorities Survey finds that grabbing federal incentive dollars and meeting pay-for-performance mandates are the top issues facing IT execs. Find out more in the new, all-digital Time To Deliver issue of InformationWeek Healthcare. (Free registration required.)