Natural Language Processing Takes Center Stage In EHRs
NLP tied to speech recognition software can help physicians enter structured data into computers more easily.
At least three leading vendors—Allscripts, eClinicalWorks, and Greenway—have taken the first steps in this direction. All of these companies are applying voice recognition coupled with NLP to their ambulatory-care EHRs, and Allscripts is doing the same with its Sunrise acute-care products, including Sunrise Mobile MD II.
Allscripts is working with M*Modal, a MedQuist subsidiary that has a cloud-based voice recognition application. (Allscripts also offers its customers Dragon, the speech recognition program of Nuance Communications.) The company has already integrated M*Modal's NLP software into its Sunrise EHR, and the two companies are developing the application on the ambulatory care side.
[Is it time to re-engineer your Clinical Decision Support system? See 10 Innovative Clinical Decision Support Programs.]
Vern Davenport, chairman and CEO of M*Modal, told InformationWeek Healthcare that his company's product can convert voice to text and do "context enablement" to create discrete data elements that go into EHR templates. The application can put problems, procedures, and medications into the correct fields, he said.
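The kind of "context enablement" Davenport describes can be illustrated with a deliberately simplified sketch: scan transcribed dictation for section cues and split what follows into discrete elements. The section patterns and field names below are hypothetical stand-ins, not M*Modal's actual method, which relies on far more sophisticated language understanding.

```python
import re

# Hypothetical section cues a dictation transcript might contain;
# real clinical NLP engines use trained models, not keyword patterns.
SECTION_PATTERNS = {
    "problems": r"(?:problem list|assessment):\s*([^.]+)",
    "medications": r"medications?:\s*([^.]+)",
    "procedures": r"procedures?:\s*([^.]+)",
}

def extract_discrete_elements(transcript: str) -> dict:
    """Map free-text dictation into discrete EHR template fields."""
    fields = {}
    for field, pattern in SECTION_PATTERNS.items():
        match = re.search(pattern, transcript, re.IGNORECASE)
        if match:
            # Split comma-separated items into individual entries.
            fields[field] = [item.strip() for item in match.group(1).split(",")]
    return fields

transcript = (
    "Assessment: type 2 diabetes, hypertension. "
    "Medications: metformin 500 mg, lisinopril 10 mg."
)
print(extract_discrete_elements(transcript))
```

The point of the sketch is the output shape: each recognized phrase lands in a named field rather than remaining a single blob of narrative text, which is what lets the EHR populate its templates.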
Because M*Modal is cloud-based, its use is not limited to specific computer devices, and the application can learn from all of the physicians who use it, Davenport noted.
Bill Fera, MD, executive director of healthcare advisory services for Ernst & Young, believes that this characteristic will help M*Modal continue to improve its application. Overall, he said, applications that extract coded data from free text have gotten significantly better, compared with a few years ago.
EClinicalWorks (ECW) is coming out with a native iPad version of its EHR next summer; at the same time, it will release a new EHR feature called Scribe that uses natural language processing to help doctors codify their documentation. (Because of the limitations of iPad keyboards, NLP will be needed to aid this process.) ECW is using Nuance's Dragon product to extract the data from transcribed dictation.
In a demonstration of Scribe at the recent Health Information and Management Systems Society (HIMSS) conference, an ECW representative spoke into a microphone and had Dragon place the transcribed dictation into categories such as "chief complaint" and "family history." After finishing his dictation, he pressed an "extract data" button, and the codified version of the data popped up on the left side of the computer screen.
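The categorization step in that demo can be sketched as segmenting a transcript on spoken section headings such as "chief complaint" and "family history." The cue list and function below are illustrative assumptions, not how Scribe or Dragon actually work internally.

```python
# Hypothetical section headings a physician might dictate aloud.
SECTION_CUES = ["chief complaint", "family history", "review of systems"]

def segment_dictation(transcript: str, cues=SECTION_CUES) -> dict:
    """Slice a transcript into sections at each dictated heading."""
    text = transcript.lower()
    # Locate each cue that actually appears, ordered by position.
    hits = sorted((text.find(cue), cue) for cue in cues if cue in text)
    sections = {}
    for i, (pos, cue) in enumerate(hits):
        start = pos + len(cue)
        end = hits[i + 1][0] if i + 1 < len(hits) else len(text)
        sections[cue] = transcript[start:end].strip(" .:,")
    return sections

print(segment_dictation(
    "Chief complaint chest pain for two days. "
    "Family history father with coronary artery disease."
))
```

Once the narrative is bucketed by section, a second pass (like the "extract data" step in the demo) can codify each bucket independently.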
“Workflow is incredibly important in utilizing an EHR,” Girish Kumar Navani, CEO of eClinicalWorks, told InformationWeek Healthcare. “eClinicalWorks is simplifying technology by taking dictation as well as free text and transforming it into structured data. This will further streamline the use of EHRs in physician practices.”

Jim Ingram, MD, chief medical officer of Greenway, agrees. "I really see the use of natural language processing as an enabler," he said in an interview with InformationWeek Healthcare. "It's providing the physicians who are later adopters of EHR technology with an easier way to move into this world and get the benefits of using an EHR."
About a year ago, he said, Greenway integrated NLP into its EHR, using M*Modal's engine, and it has been introducing this PrimeSPEECH product to customers for six months. So far, the application has been used to extract coded data for billing, enhance decision support, and pull information into medical histories. In the future, he said, Greenway will add functionality for quality reporting and order tracking, among other things.
Greenway is also using the application in reverse, extracting elements of a Continuity-of-Care Document (CCD), such as medications, problems, and allergies, and merging those with the coded portions of transcribed dictation in the narrative version of the note.
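A CCD is an HL7 XML document, so pulling out sections such as medications, problems, and allergies is at heart an XML-parsing task. The sketch below uses a drastically simplified stand-in structure; a real CCD follows the HL7 CDA schema, with namespaced sections identified by templateId and code rather than a plain `name` attribute.

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for a CCD; real CCDs use the HL7 CDA schema.
ccd_xml = """
<document>
  <section name="medications">
    <item>lisinopril 10 mg</item>
    <item>metformin 500 mg</item>
  </section>
  <section name="problems"><item>hypertension</item></section>
  <section name="allergies"><item>penicillin</item></section>
</document>
"""

def extract_ccd_sections(xml_text: str,
                         wanted=("medications", "problems", "allergies")) -> dict:
    """Pull the items out of each wanted section of a CCD-like document."""
    root = ET.fromstring(xml_text)
    return {
        sec.get("name"): [item.text for item in sec.findall("item")]
        for sec in root.findall("section")
        if sec.get("name") in wanted
    }

print(extract_ccd_sections(ccd_xml))
```

Elements extracted this way could then be merged with the coded portions of transcribed dictation, as Greenway describes, so the narrative note and the structured record stay in sync.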
Ernst & Young's Fera believes that the addition of NLP will reinforce the argument for using voice recognition in EHRs. "The main point is we're moving beyond speech recognition to natural language processing or understanding. With earlier iterations of speech recognition, the dictation generated a 'blob file' with no coded elements. So one argument against using that software was that the dictation couldn't populate your EHR fields. But now, with the continued evolution of language processing, not only can you document easily but it's getting into a formatted text."