More Mobile EHRs Add Speech Recognition - InformationWeek

Third-party interfaces to iPad native applications allow more EHR vendors to voice-enable their products.

While some companies still lag behind, EHR vendors are moving rapidly to enable their mobile products with speech recognition, either directly or through third-party interface vendors.

In the first category is Cerner, which just last month integrated Nuance Communications' speech recognition product with its ambulatory mobile EHR for iPads, according to Jon Dreyer, director of mobile solutions marketing for Nuance. In an interview with InformationWeek Healthcare, Dreyer added that in January, Epic embedded Nuance in its latest mobile EHRs for the iPhone and iPad.

Allscripts, which voice-enabled its Sunrise inpatient mobile EHR some time ago, doesn't yet have speech in its Wand ambulatory EHR. But an Allscripts spokesperson told InformationWeek Healthcare that Wand will be integrated with the MModal and Apple Siri voice recognition applications later this summer.

[ A recent usability survey puts Athenahealth on top. Read Athenahealth EHR Wins Usability Poll. ]

Other major companies are also using third-party vendors to provide an iPad-native front end that includes speech recognition to their EHRs. For example, Dreyer said, Nuance is integrated with Iconx, which makes a mobile interface for NextGen, and with MedMaster Mobility, which does the same for Greenway.

Small independent vendors have also built speech-enabled mobile EHRs for certain specialty areas, such as emergency departments, urgent care centers, and dermatology. For example, Nuance is embedded in Sparrow EDIS, Montrue Technologies' ED-specific iPad application, and Touch Medix's Lightning Charts, also designed for the ED. Modernizing Medicine and EZDerm have devised mobile EHRs specifically for dermatologists.

"The reality is that it's impractical to do any kind of documentation on mobile devices if you don't have speech," Dreyer stated. "Nobody's going to type into this thing or peck away at the onscreen keyboard. So the feedback from customers in the market today that are leveraging this has been very positive."

This is especially true in very busy environments such as the ED and in specialties where physicians must give their full attention to physical exams, Dreyer said. "It's either where the users themselves are mobile and jumping from one patient to the next, or where the physician is more hands on and having a computer between physician and patient gets in the way."

The ability to use speech for ordering simple meds is already available on some speech-enabled mobile products. But much more is coming down the line.

To begin with, Dreyer noted, Nuance wanted to make sure that its cloud-based speech recognition product for mobile EHRs was very fast and accurate. "That's the number one thing that the user needs and wants," he said. But now some of Nuance's developer partners are beginning to add "command and control" features for information retrieval. Introduced by Nuance less than a year ago, these features allow users to ask the application to show them things like their patient lists, a particular patient's lab results, or when the patient's next appointment is.

By the end of the year, Dreyer said, Nuance will introduce the next generation of command and control, which uses interactive features that depend on its "clinical language understanding" (CLU), a form of natural language processing. This "virtual assistant," Dreyer explained, will manage dialog between the user and the application.

"For example, if a doctor wants to start a patient on a medication," he said, "the system can ask for clarifying information based on the information it needs. It knows it needs medication, dosage, frequency, dispensed amount, and refills in order to place a med order. So it'll continue to ask that, and CLU is parsing out the information you're telling it so you can get a string of information, and if you miss anything, it will let you know. It will also look into the drug interaction tables in the EMR and leverage that content and present it to the user in a very natural flow."
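The dialog Dreyer describes is essentially slot filling: a med order requires a fixed set of fields, and the assistant keeps prompting until the parsed utterances have supplied them all. A minimal sketch of that loop, with invented names (this is not Nuance's actual API), might look like this:

```python
# Hypothetical slot-filling sketch of the med-order dialog described
# above. The slot names and helper functions are illustrative only.

REQUIRED_SLOTS = ["medication", "dosage", "frequency",
                  "dispense_amount", "refills"]

def missing_slots(order: dict) -> list:
    """Return the fields still needed before the order can be placed."""
    return [s for s in REQUIRED_SLOTS if s not in order]

def merge_utterance(order: dict, parsed: dict) -> dict:
    """Fold the slots extracted from one utterance into the order."""
    return {**order, **parsed}

# Simulated turns, as if the language-understanding layer had
# already parsed each spoken utterance into slot/value pairs:
order = {}
order = merge_utterance(order, {"medication": "lisinopril",
                                "dosage": "10 mg"})
assert missing_slots(order) == ["frequency", "dispense_amount", "refills"]

# The assistant would prompt for the missing fields; the next
# utterance fills them, completing the order.
order = merge_utterance(order, {"frequency": "once daily",
                                "dispense_amount": "30 tablets",
                                "refills": "2"})
assert missing_slots(order) == []  # order is complete
```

The point of the structure is the one Dreyer makes: the system knows which fields a med order needs, parses whatever the physician volunteers, and asks only for what is still missing.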

How does the speech recognition application access the EHR database to get the drug interaction rules? All mobile EHRs are thin clients interacting with a server that stores the EHR application and the data. So Nuance's cloud-based application can communicate with the server that stores the EHR to get the necessary information.
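In that architecture, the interaction tables live on the EHR server, not in the speech service; the cloud application simply queries the server for each candidate pair. A rough sketch under those assumptions (the lookup function and its return shape are invented for illustration):

```python
# Sketch of the thin-client pattern described above: the speech
# service asks the EHR server, which owns the drug-interaction
# tables, rather than holding that content itself.

def check_interactions(ehr_lookup, active_meds, new_med):
    """Ask the EHR server for interactions between the patient's
    current meds and the drug being ordered."""
    warnings = []
    for med in active_meds:
        result = ehr_lookup(med, new_med)  # server-side table lookup
        if result:
            warnings.append(f"{med} + {new_med}: {result}")
    return warnings

# Stand-in for the EHR server's interaction table:
def fake_ehr_lookup(drug_a, drug_b):
    table = {frozenset(["warfarin", "ibuprofen"]): "increased bleeding risk"}
    return table.get(frozenset([drug_a, drug_b]))

print(check_interactions(fake_ehr_lookup,
                         ["warfarin", "metformin"], "ibuprofen"))
# → ['warfarin + ibuprofen: increased bleeding risk']
```

Because the speech layer only queries, the clinical content stays in one place and the voice assistant can surface warnings "in a very natural flow," as Dreyer puts it, without duplicating the EMR's rules.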

One thing that's still unclear, however, is how much clinicians will trust a speech recognition system that could misinterpret what they say when they're ordering a medication or a lab test. This has been a challenge for Intermountain Healthcare and MModal, which are jointly developing a speech recognition system that can be used in computerized physician order entry (CPOE). To date, their beta version has been used only for commonly prescribed meds. Lab and imaging orders and nursing orders are next on the roadmap.

6/17/2013 | 1:30:09 AM
re: More Mobile EHRs Add Speech Recognition
With mobile EHR systems, some sort of speech recognition program is crucial, since attempting to type on the small onscreen keyboards is very cumbersome and I don't see physicians accepting that into their workflows. One of the major advantages of mobile EHR systems is that they allow physicians direct contact with their patients and provide a faster alternative for most of them, so having them type information would diminish this advantage. Of course, any speech recognition program needs to be continually tested and improved, because you don't want the system to misinterpret what you say, especially when it comes to healthcare.

Jay Simmons
InformationWeek Contributor