Common sense is overrated. Conventional wisdom might suggest that a particular course of action is the obvious one, but all too often, putting that "no-brainer" to the test in a controlled experiment yields surprising results, what you might call "uncommon sense." Put another way, common sense is no substitute for empirical data.
The history of medicine is filled with object lessons illustrating this point. Take electronic fetal heart monitoring (EFM). The technology became available in the late 1950s, and within a decade, clinicians across the U.S. were using it in the hopes of spotting pregnancy complications early on. Today, if an obstetrician doesn't use EFM and a serious complication occurs, it's likely he or she will be sued for malpractice.
Unfortunately, the technology was widely adopted before it was tested rigorously in clinical trials. Subsequent studies have shown that in most respects, EFM is no better than examining a patient with a low-tech Pinard stethoscope. The lesson here is pretty clear: Don't be too quick to jump on a technology "solution" before there's enough evidence to show it works.
So which IT "solutions" will improve patient safety, which won't, and which might even put patients at increased risk? Let's explore some of the evidence.
First, the Bad News
When researchers recently looked at how nurses and nurse managers used automated medication-dispensing systems, computerized order-entry systems, and checklists to watch for medical errors, they found that, although the tools were effective in alerting nurses to errors, many RNs didn't act on those alerts. More than half of the nurses surveyed said they had seen other clinicians take dangerous shortcuts, for instance, but only 17% discussed it with colleagues. Similarly, more than 33% of the nurses saw incompetent behavior that "had led to a near miss or actual harm to a patient." But only 11% of this group took any action.
The reason for the inaction: Roughly six out of 10 nurses remained silent because they were afraid to speak up or couldn't get others to pay attention. The bottom line, according to the investigators from the American Association of Critical-Care Nurses and their associates: "Tools don't create safety; people do." I would take it a step further. Unless we create a zero-tolerance culture in which clinicians can't be intimidated, bullied, or ignored when they see mistakes, patient safety is only a pipe dream. And that's not to belittle the value of healthcare IT. It simply illustrates that IT is only one factor in a complex environment.
Unfortunately, clinician inaction isn't the only thing that can derail health IT, notes Christian M. Pettker, MD, an assistant professor at Yale School of Medicine and an expert on patient safety. "There are a lot of gains associated with EHRs, but there are also some sacrifices," he says. "Our access to information like prior admissions, lab data, and care provided by other clinicians is helpful because it can prevent us from repeating tests and helps us come to a more timely diagnosis."
However, because clinicians know that the information is so readily accessible, they tend to communicate person to person less frequently, Dr. Pettker says. It's easy to assume others have seen important patient data and are acting on it simply because it's in the system. "If I order a medication that has to be given immediately, I can't assume that if I put that order into the computer and press the STAT button, the patient will be given it right away," he says. "I still have to talk directly to the nurse on the unit and probably the pharmacy as well. These healthcare providers are not glued to their computers. They are often taking care of other patients. And there may not be an alert system that sends an urgent message to their smartphone, for instance, to get their attention."