Adopting an electronic health record (EHR) alone may not improve the quality of care, according to a Health Affairs study from Weill Cornell Medical College investigators.
The researchers looked at participants in New York's Primary Care Information Project (PCIP), a city program that began in 2005 and became a federally funded regional extension center (REC) in 2011. PCIP subsidized the cost of EHRs for physicians in underserved areas of New York. The doctors had to buy an eClinicalWorks EHR that had been modified to emphasize preventive care, and PCIP provided technical assistance and coaching on quality improvement.
To measure the impact of the EHR and the technical assistance on quality of care, Weill Cornell researchers used a statewide database of paid claims from several health plans. Using those claims, they compared PCIP practices' changes in performance on the program's quality measures with those of matched controls: physicians outside of PCIP.
Andrew Ryan, lead author on the paper and an assistant professor of public health at Weill Cornell Medical College, told InformationWeek Healthcare that the researchers didn't know whether the comparison practices had EHRs or not. But because they were small primary care practices in underserved areas (not necessarily in New York), he surmised that many or most of them lacked EHRs.
The study measured quality improvement on all 10 of the PCIP measures, as well as on a subset of "EHR-sensitive" measures that previous studies had shown could be affected by the use of EHRs. Among these measures were breast cancer screening for women, retinal exams for patients with diabetes, chlamydia screening for women, and colorectal cancer screening.
The researchers found that for all quality measures, PCIP was not significantly associated with quality improvement at any interval from 6 to 24 months after EHR implementation, regardless of how much technical assistance the physicians received. For the EHR-sensitive measures, quality improvement occurred in practices that received at least eight technical assistance visits and had been using the EHR for more than nine months. Quality did not improve in practices that received low or moderate amounts of assistance.
One implication of this study, according to Ryan, is that physicians need a lot of help to learn how to use their EHRs in quality improvement. To start with, Ryan said, there's a cultural barrier. "You've got a cohort of physicians who are used to practicing in a certain way and have been doing that their whole careers." They may not be computer savvy, Ryan pointed out, and to achieve the full potential of EHRs, they have to change how they document visits, how they write prescriptions, and how they decide what type of treatment patients get.
The PCIP practices' EHR included clinical decision support that alerted physicians when patients were due for certain kinds of preventive or chronic care. But in general, Ryan said, physicians may ignore alerts when there are too many of them.
An earlier study of PCIP found that a large percentage of diagnoses were not documented as discrete data; as a result, PCIP practices underreported the services provided by doctors on six of 11 preventive care measures. This finding could provide another explanation for the apparent lack of quality improvement in many of the practices in the Weill Cornell study.
Could the study findings be applied to RECs elsewhere in the country that help physicians learn how to use their EHRs? PCIP predated the other RECs, Ryan noted, but like them it helped practices with few resources do a better job for patients in poor neighborhoods.
"PCIP was implemented under very challenging circumstances. And we found some evidence that for certain kinds of measures with a lot of technical assistance, it can improve quality. If it can be done in New York with difficult practices, it could be done in other places."