— Recently, the National Academy of Sciences distributed a well-publicized report stating that each year, between 40,000 and 100,000 deaths in this country are a consequence of medical error.
After reading "To Err Is Human," the report put out by the NAS's Institute of Medicine, I was left with one dominant impression: the problem of medical error is not so much a medical problem as a systems problem, a quasi-mathematical one.
I’m a mathematician, so you may suspect this conclusion is a result of my professional myopia, of my confusing an ankle for an angle, perhaps. Let me explain.
People readily understand instances of a physician’s incompetence, a nurse’s negligence, a pharmacist’s lapse. These are all agents who are culpable, and making mistakes is in everyone’s realm of experience.
What’s much more difficult to grasp is a complex system with many interacting parts that are tightly coupled and interdependent. Such systems are necessary in modern health care, but unless an impersonal analysis of their structure is made, we likely won’t put a significant dent in the intolerably large number of medical errors or in the sickeningly unnecessary deaths they lead to.
Look to the Skies
It is possible. Consider the airline industry, whose safety record is extraordinary and getting better. As has often been noted, a person has one chance in 7 million of dying on any given commercial airline flight in this country, and such deaths, befalling a cross-section of healthy Americans, are never overlooked by the media.
Accidents are rare and spectacular and, as the saying goes, pilots are the first to arrive at the scene.
In the health care industry, fatal accidents are common, occurring in somewhere between 1 in 200 and 1 in 400 hospital visits; they are woefully underreported, and they happen to sick, isolated individuals, not to any of the medical staff.
The airline industry, of course, differs in countless ways from the health care industry. Medical outcomes often come in shades of gray, and the questions of what constitutes an error and whether a death can be attributed to any given error are sometimes difficult to answer.
Nevertheless, the systems that are in place in the airline industry provide an unfriendly environment for errors to occur or to propagate.
Some examples: Engineers don’t scribble their analysis of a problem part on a piece of paper and leave it for someone on the next shift to decipher. Factory personnel don’t test parts in isolation but run simulations to see how they work together. Managers don’t introduce new equipment without an extensive program to train people in its use.
Pilots and others report accidents and near-collisions to various authorities for tabulation and investigation. Designers build in fail-safe and ergonomic improvements to minimize confusion and the need to rely on memory. No one takes pride in doing his or her job on three hours sleep or in being aloof from his or her support team.
In general, accidents occur more frequently in complex systems.
As an unrealistically simple illustration, take a procedure that depends on 30 independent component parts (tubes, machines, scalpels, medications, rates and doses, clamps, etc.) working correctly. The laws of probability tell us that even if each component works flawlessly 999 times out of 1,000, the procedure will still fail about 3 percent of the time (since 0.999^30 ≈ 0.97).
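Readers who want to check the arithmetic can do so in a few lines of Python:

```python
# A procedure succeeds only if all 30 independent components work.
# Each component works 999 times out of 1,000.
p_component = 0.999
n_components = 30

p_success = p_component ** n_components   # probability every component works
p_failure = 1 - p_success                 # probability the procedure fails

print(f"success: {p_success:.3f}")   # about 0.970
print(f"failure: {p_failure:.3f}")   # about 0.030, i.e. 3 percent
```

The same calculation shows how quickly reliability erodes with complexity: at 100 such components, the success probability drops to roughly 0.999^100 ≈ 0.90, a 10 percent failure rate.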
The dependencies in the procedure have to be anticipated and analyzed, the probabilities of component failure have to be reduced, and redundancies and backups must be built in. Computers and software should be used in prescribing drugs, in recording patient histories and wherever else is appropriate.
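To see why redundancy pays off so dramatically, extend the illustration above: suppose each component gets one independent backup, so a component-level failure now requires both copies to fail at once. (The independence of the backup is, of course, an idealization, not something the report claims.)

```python
# Effect of one independent backup per component on the 30-component procedure.
p_fail_single = 0.001               # a lone component fails 1 time in 1,000
p_fail_backed = p_fail_single ** 2  # original AND backup must both fail

n = 30
p_success_no_backup = (1 - p_fail_single) ** n   # about 0.970
p_success_with_backup = (1 - p_fail_backed) ** n # about 0.99997

print(f"without backups: {p_success_no_backup:.5f}")
print(f"with backups:    {p_success_with_backup:.5f}")
```

Under these assumptions, the procedure's failure rate drops from roughly 3 percent to roughly 3 in 100,000, which is the quantitative case for the fail-safes and redundancies the report recommends.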
These and the other efforts recommended in “To Err Is Human” have little to do with medicine and much to do with operations research and systems theory.
Let’s help ensure that these recommendations are implemented by making the following New Year’s resolution: Let news coverage of every plane crash remind us of the tens of thousands of almost invisible deaths due each year to medical error.
The story of these deaths and of the systematic errors that lead to them is harder to cover but more important than that of the random plane crashes that so intrigue us.
Professor of mathematics at Temple University, John Allen Paulos is the author of several books, including A Mathematician Reads the Newspaper and Once Upon a Number. His Who’s Counting? column on ABCNEWS.com appears on the first day of every month.