
Medicare Safety Data Criticized As Flawed

Hospital groups, Joint Commission president say use of claims data invalidates comparisons published on HospitalCompare website.

As part of Medicare's safety and value-based purchasing initiatives, the Centers for Medicare and Medicaid Services (CMS) recently started publishing additional information about hospitals' relative safety performance on its HospitalCompare website. The agency's initiative is encountering fierce opposition from hospital groups as well as the Joint Commission, the organization that accredits hospitals.

"Hospitals are working very hard to improve safety on a daily basis," Mark Chassin, MD, president of the Joint Commission, told InformationWeek Healthcare. "These [CMS] data don't really contribute much to our knowledge, understanding, or ability to improve, because they're not good quality measures."

Chassin argued that HospitalCompare's ratings of hospital performance on inpatient complications and mortality rates are flawed because they're based on claims data, not on clinical data from hospital charts or electronic health records (EHRs).

Patrick Conway, MD, chief medical officer of CMS and director of the agency's office of clinical standards and quality, agreed that clinical data is preferable to claims information for measuring hospital-acquired conditions (HACs) and deaths related to hospitalizations. But in a statement to InformationWeek Healthcare, he contended that we should not let the perfect be the enemy of the good.

"Reporting HAC data demonstrates CMS' commitment to improve patient safety," he said. "By making HAC data transparent, CMS sheds light on those preventable events where patients are harmed while seeking care."

Before adding the safety information, HospitalCompare already showed how well institutions did on various process measures, such as whether Medicare patients who'd had heart attacks received aspirin and beta blockers. Hospitals voluntarily report this data, which is extracted from their records. The website also features 30-day mortality data on patients with three primary diagnoses: heart attack, heart failure, and pneumonia.

The new postings include hospital-specific data on deaths of patients who had breathing problems after surgery, had an operation to repair a weakness in an abdominal aorta, or had a serious treatable complication after a procedure.

Additionally, HospitalCompare now shows how often particular hospitals' patients suffer complications such as a collapsed lung, a blood clot following surgery, or an accidental cut or tear during treatment. Earlier this year, CMS began posting data on other complications, such as pressure sores, falls, and catheter-related infections.

Aside from the competitive implications of comparing hospitals publicly on these measures, all of this data will be used in calculating how much hospitals might lose in Medicare reimbursement under CMS' value-based-purchasing program, scheduled to begin a year from now.

According to Chassin, the new metrics are "fatally flawed and misleading." The measures of hospital-acquired conditions, he said, are based on how secondary diagnoses (diagnoses other than the patient's primary problem) are coded for billing purposes. "That's not a reliable assessment of whether the outcome [i.e., the complication] occurred," said Chassin. "Secondary diagnoses are notoriously variable in how they're coded."

Conway insisted, however, that this data is reliable because "coders assign diagnosis codes based on the objective evidence presented to them in a medical record and in accordance with guidelines." If there's a diagnosis code that indicates a complication, he said, "this identifies a quality improvement opportunity for hospitals and provides information to consumers."

The problem with the three new mortality measures, as well as the earlier ones, Chassin said, is that CMS inadequately adjusts the data for the relative risk factors of patients when they were admitted to the hospital. Based on secondary diagnoses, the risk adjustment fails to reflect the severity of a patient's condition, he said.

Conway retorted, "We disagree that the mortality measures are flawed, and the community at large agrees with us. The risk-standardized mortality measures were developed by a team of clinical and statistical experts from Yale and Harvard universities, using a methodology that has been published in peer reviewed literature.

"The 30-day mortality measures are estimated with Medicare administrative data using models that were validated against medical record-based models. In these validation efforts, the results of the models with administrative claims and enrollment data were shown to be highly correlated with the results of models based on clinical data."

Chassin pointed out that the Joint Commission's Quality Check report card on hospital performance is based entirely on clinical data. Moreover, he said, the Joint Commission's measures have been tested in the real world and shown to be effective in helping hospitals improve their outcomes.

Ironically, CMS and the Joint Commission use many of the same "core" measures in their published reports, and the hospital data is gathered and analyzed the same way for both organizations. But not all of the measures shown in Quality Check are posted on the HospitalCompare website. For example, CMS is not publishing core measures for perinatal care or care in psychiatric hospitals.

In Conway's view, HospitalCompare, Quality Check, and other published comparisons of hospital performance can all be beneficial, despite their differences in methodology. "We welcome the availability of other data sources ... to help bring more transparency and visibility to health care quality," he stated.

Lisa Henderson
User Rank: Apprentice
10/25/2011 | 11:57:27 PM
re: Medicare Safety Data Criticized As Flawed
Collecting safety data for ANY healthcare-related organization is ultimately supposed to inform decision-making for better patient care based on outcomes. Posting data ad hoc, with metrics or benchmarks that aren't standard, and taking a punitive stance are counterproductive to the whole point. There needs to be some reason applied to these moves--some kind of overall public health policy perspective that can add something to these initiatives. Transparency is great, but unexplained or misleading data is only more confusing.

Lisa Henderson, InformationWeek Healthcare, contributing editor