Recent Wall Street Journal commentary ripped into health IT, but the critics offered very little in the way of viable solutions.
A recent Wall Street Journal editorial by Stephen Soumerai and Ross Koppel ran under the title "A Major Glitch for Digitized Health-Care Records." Soumerai, a professor at Harvard, and Koppel, of the University of Pennsylvania, argue that the costs of health IT are prohibitive, and they cite a number of studies that purport to show the current federally supported EHR initiative hasn't really saved the nation anything.
The emphasis in the editorial was on dollars and cents, and it was based mostly on sheer guesswork. Soumerai and Koppel cite several studies, but they give virtually no before-and-after costs; instead, we are expected to take it on faith that the studies support the excessive-cost argument. Nor do the editorialists attempt to suggest a solution to the problem if, in fact, one exists.
They put the blame on high-tech companies without assigning significant blame to hospital and practitioner politics. The editorial also says that standards are needed but doesn't suggest what they should be, how they should be achieved, or whether they will work. Nor does it discuss financial penalties for institutions that fail to meet established standards.
I do agree with the authors' point that IT training is a major, expensive undertaking, given the enormous number of stakeholders in hundreds of locations. Training is a mountain of Everest proportions, with thousands of people to be trained on the new systems.
I also agree that we need to take a serious look at the technology standards. The American health IT system will not work if every Tom, Dick, and Harry company launches its own idea of what software should be used and what database structures will be robust enough.
Consider what happens when an oil company, a single entity, looks for the location of wells abandoned years ago to see if new technology can make them productive again, only to find the data in an unknown format, entered with software that is now obsolete, on a computer that has been discontinued. If one organization faces that problem, what hope does a laissez-faire government approach have with thousands of stakeholders, unless there is a strong, stable group in charge of overall development? One hospital I know of uses 13 different systems! That's pretty unmanageable.
Whether we like government control or hate it, the only way electronic health systems will work is with tight control, NOT with individual companies doing their own thing.
Soumerai and Koppel seem to have a bias against EHRs. While I do not think a lifelong e-patient record will be quickly achievable nationwide, there are segments that are achievable and U.S. stakeholders should consider them; they would cost far less than the complete package. These partial solutions tend to get overlooked in the general negative discussion, but providers can implement them until a total package is achieved, at which point they would be integrated into the full system.
How To Avoid Huge Costs
Avoiding huge costs basically requires setting strict standards, developed by competent stakeholders. It also requires adherence to those standards. The editorial stressed how important standards are but gave no idea of which standards are needed or achievable. Here are a few, along with some suggestions.
-- Legal: This kind of standard is likely impossible, but we still need to define who owns patient data and who will be allowed to view it.
-- Database: A dialogue with NASA makes sense. It has experience in storing and retrieving huge amounts of data from orbiting satellites such as NOAA and LANDSAT.
-- Technology: The healthcare industry needs to establish a standard software package that will last a lifetime. COBOL and FORTRAN have lasted close to 60 years because the industry had a huge amount of data and could not afford to migrate to other packages; operating systems, by contrast, change every few years. We need a simple, easy-to-use software suite. Some packages now on the horizon use a gene-like structure that IT managers could also apply to developing nanotechnology, particularly in medical systems, and the gene approach to software can work in any natural language.
-- Teaching: This need not be expensive. Hundreds of suitable people will likely vie for the chance to help develop course material.
-- Security: The gene approach to technology reduces the problem, but data still has to be secure from data entry, through the database, to retrieval. That requirement in itself creates a security challenge because of the large amount of network activity that will be generated.
Most serious IT executives recognize that the EHR systems now in place are a work in progress. Universal standards and better training will eventually create the kind of robust infrastructure needed to provide cost-effective patient care.
Bernard A. Hodson, formerly a tenured full professor of computer science at the University of Manitoba, has served as a computer analyst for UNIVAC and has worked for Symbionics, a computer service bureau, and the Canada Centre for Remote Sensing.