Information technology is not medicine's enemy but an ally that can ultimately improve patient care and reduce costs.
The latest digital issue of InformationWeek Healthcare featured an intriguing article by Paul Cerrato suggesting that some of my ideas regarding evidence-based medicine are a source of animosity between CIOs and their physician stakeholders--physicians whom he believes consider information technology "the enemy" of good medicine. His article touches on several key issues that should be top of mind for every physician leader and CIO today. But is IT really the enemy of good medicine?
Is Medicine An Art or A Science?
At the heart of the article is a very old debate: Is medicine an art or a science? I've never understood why it can't be both. But the more important question is the degree to which we pursue the art at the expense of science. Carl Sagan, my childhood idol, once wrote of the origins of modern medicine:
"Hippocrates introduced elements of the scientific method. He urged careful and meticulous observation: 'Leave nothing to chance. Overlook nothing. Combine contradictory observations. Allow yourself enough time.' … He stressed honesty. He was willing to admit the limitations of the physician's knowledge."
The man behind the oath every practicing physician takes was very much an advocate for science. And any physician advocating art over science is ignoring centuries of transformative medical progress made through science. Personally, I don't run into many physicians like that anymore--they are literally a dying breed.
Is Evidence-based Medicine Cookbook Medicine?
"Cookbook medicine" is an extremely poor characterization of evidence-based medicine: It is polarizing, narrowly focused, inaccurate in its objectives, and confuses the process of gaining insight with the process of deciding what to do with the insight. Sagan writes elsewhere in his book:
"Science by itself cannot advocate courses of human action, but it can certainly illuminate the possible consequences of alternative courses of action. … Science invites us to let the facts in, even when they don't conform to our preconceptions."
As one example, SAS has been working with a leading cancer care institution to create EBM (evidence-based medicine) software that lets a medical practitioner obtain specific information about a particular patient, and then look at summarized medical outcomes from all the various treatments of prior patients that resemble this one. The software doesn't tell the physician what to do--every patient is different, and cancer is a complex family of diseases. But the insight from this evidence gives the medical practitioner more information to make better decisions that produce high quality medical outcomes and lower costs.
Cerrato's article also points out that many people equate EBM with the results from clinical trials--a valuable model of research that is incapable of fully addressing real-world heterogeneity. But as just described, EBM is not limited to traditional clinical research. Further, we have relied on clinical trials for medical insight because we have not had alternatives such as information technology that could provide meaningful insight into real-world experiences (e.g., co-morbidities). So if we want to address the shortcomings of clinical research, information technology in the service of EBM is not the disease, it is the cure.
Is The Evidence Poor?
The article also raises concerns about the reliability and validity of conclusions drawn from scientific and statistical approaches to medical outcomes. I agree with two points: clinical research does not address broad enough populations of patients, and it does not mimic the real world. And neither does the experience of an individual physician in a community care setting. EBM is a credible way of overcoming the cognitive limitations of the human mind and the limited experiences of individual medical practitioners.
The article also asserts that Type 2 statistical error erodes the value of EBM and technology. For those who slept through that Stats 101 class, Type 2 statistical error in this case refers to the very real risk of missing a potentially beneficial treatment due to flaws in research design and execution. But the article fails to mention two other critical risks: Type 1 statistical error and bias.
Type 1 statistical error is the risk of concluding a therapy works that actually does not. Many practicing physicians get hit with that risk every day--when they incorrectly conclude that because drug A or therapy B worked on the last patient they treated, it will work on the next patient. The question on the table is whether we want to address this risk, because one advantage that science offers over art is that it better balances the risks of Type 1 and Type 2 error.
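The asymmetry between these two error types can be sketched with a small Monte Carlo simulation. This is my own illustration, not anything from the article: the success rates, sample sizes, and the simple pooled z-test are invented for the example. It contrasts the "it worked on my last patient" heuristic with a modest two-arm study:

```python
import math
import random

random.seed(42)

def significant(succ_a, n_a, succ_b, n_b):
    """Pooled two-proportion z-test; True if |z| exceeds 1.96
    (roughly the 5% two-sided significance threshold)."""
    p_a, p_b = succ_a / n_a, succ_b / n_b
    p_pool = (succ_a + succ_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return se > 0 and abs((p_a - p_b) / se) > 1.96

def study(p_treat, p_ctrl, n):
    """Simulate one study with n patients per arm; True if the
    treatment arm is declared significantly better or worse."""
    t = sum(random.random() < p_treat for _ in range(n))
    c = sum(random.random() < p_ctrl for _ in range(n))
    return significant(t, n, c, n)

SIMS = 20_000

# Type 1 risk, anecdote style: the drug is truly no better than a 50%
# background recovery rate, but the physician concludes it works
# whenever the single treated patient happens to improve.
anecdote_type1 = sum(random.random() < 0.5 for _ in range(SIMS)) / SIMS

# Type 1 risk, study style: same ineffective drug (50% vs. 50%),
# judged by a 100-patient-per-arm comparison instead.
study_type1 = sum(study(0.5, 0.5, 100) for _ in range(SIMS)) / SIMS

# Type 2 risk, study style: the drug truly helps (65% vs. 50%) but
# the 100-per-arm study fails to detect the difference.
study_type2 = sum(not study(0.65, 0.5, 100) for _ in range(SIMS)) / SIMS

print(f"anecdote Type 1 rate: {anecdote_type1:.2f}")  # roughly 0.50
print(f"study Type 1 rate:    {study_type1:.2f}")     # near 0.05
print(f"study Type 2 rate:    {study_type2:.2f}")
```

Under these made-up numbers, the single-patient heuristic draws a false conclusion about half the time, while the study holds Type 1 error near its 5% design target and carries a quantifiable, improvable Type 2 risk--which is exactly the balancing act science offers and anecdote cannot.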