Recently, the National Academy of Sciences distributed a well-publicized report stating that each year, between 40,000 and 100,000 deaths in this country are a consequence of medical error.
After reading "To Err Is Human," the report put out by the NAS’s Institute of Medicine, I was left with one dominant impression: the problem of medical error is not so much a medical problem as a systems problem, almost a quasi-mathematical one.
I’m a mathematician, so you may suspect this conclusion is a result of my professional myopia, of my confusing an ankle for an angle, perhaps. Let me explain.
People readily understand instances of a physician’s incompetence, a nurse’s negligence, a pharmacist’s lapse. These are all agents who are culpable, and making mistakes is in everyone’s realm of experience.
What’s much more difficult to grasp is a complex system with many interacting parts that are tightly coupled and interdependent. Such systems are necessary in modern health care, but unless an impersonal analysis of their structure is made, we likely won’t put a significant dent in the intolerably large number of medical errors or in the sickeningly unnecessary deaths they lead to.
It is possible. Consider the airline industry, whose safety record is extraordinary and getting better. As has often been noted, a person has one chance in 7 million of dying on any given commercial flight in this country, and such deaths, befalling a cross-section of healthy Americans, are never overlooked by the media.
Accidents are rare and spectacular and, as the saying goes, pilots are the first to arrive at the scene.
In the health care industry, fatal accidents are common, occurring in roughly 1 in 200 to 1 in 400 hospital visits; they are woefully underreported, and they happen to sick, isolated individuals rather than to any of the medical staff.
The airline industry, of course, differs in countless ways from the health care industry. Medical outcomes often come in shades of gray, and the questions of what constitutes an error and whether a death can be attributed to any given error are sometimes difficult to answer.
Nevertheless, the systems that are in place in the airline industry provide an unfriendly environment for errors to occur or to propagate.
Some examples: Engineers don’t scribble their analysis of a problem part on a piece of paper and leave it for someone on the next shift to decipher. Factory personnel don’t test parts in isolation but run simulations to see how they work together. Managers don’t introduce new equipment without an extensive program to train people in its use.
Pilots and others report accidents and near-collisions to various authorities for tabulation and investigation. Designers build in fail-safe and ergonomic improvements to minimize confusion and the need to rely on memory. No one takes pride in doing his or her job on three hours sleep or in being aloof from his or her support team.
In general, accidents occur more frequently in complex systems.
As an unrealistically simple illustration, take a procedure that depends on 30 independent component parts (tubes, machines, scalpels, medications, rates and doses, clamps, etc.) working correctly. The laws of probability tell us that even if each component works flawlessly 999 times out of 1,000, the procedure will still fail about 3 percent of the time (since 0.999^30 ≈ 0.97).
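The arithmetic behind this illustration can be checked in a few lines. The numbers below (30 components, a 0.999 per-component success rate) are the article's illustrative figures, not real clinical data:

```python
# Probability that a procedure with 30 independent components succeeds,
# when each component works with probability 0.999.
n_components = 30
p_component = 0.999

p_success = p_component ** n_components  # all 30 must work simultaneously
p_failure = 1 - p_success

print(f"P(all components work) = {p_success:.4f}")  # about 0.97
print(f"P(procedure fails)     = {p_failure:.4f}")  # about 0.03
```

The point generalizes: multiplying many probabilities that are each very close to 1 still erodes the overall success rate, which is why tightly coupled systems with many parts fail more often than any single part suggests.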