In 1999, the Institute of Medicine published a brave and pivotal report called "To Err Is Human," estimating that between 44,000 and 98,000 Americans die in hospitals annually from medical mistakes. We're not talking about malpractice, such as accidents caused by incompetent doctors or nurses. We're talking about errors made by fallible humans, errors that were not discovered in time to prevent harm to a trusting patient, and many of us involved in this issue believe those figures are substantially low.
But where on earth is the crossover point between aviation and medicine? Why would physicians and nurses and hospital leaders look to the flying of airplanes for improvements in patient safety? Isn't the practice of medicine far more complex and subjective and variable than flying even the most sophisticated airplanes?
The truth is, aviation and medicine are very closely related because they both depend utterly on imperfect humans, whether in operating rooms or 747 cockpits.
When you accept the reality that even the best and the brightest among doctors as well as airline pilots will still make mistakes despite decades of experience and dedication and training, you realize that a system not designed to expect and safely absorb human error will constantly suffer from those human mistakes.
In commercial aviation, our vast improvements in safety resulted from learning to expect human errors and building systems, such as checklists and procedures and standard practices, to minimize error but still absorb any reasonably expected screw-up before it progresses to tragedy.
We learned to teach co-pilots, for instance, that even if God himself is in the left seat, they still have to speak up instantly when something is wrong. Instead of the autocratic leader who needed and accepted no advice, we've redefined leadership by creating strong captains who know how to build a team that helps them make better, safer decisions. In other words, we built exactly what health care is now struggling to create: a cooperative, collegial system accepting of its human weaknesses, the complete opposite of the traditional hierarchical medical structure that assumes no physician can make a critical error.
Moreover, health care now understands that the systemic failures that most often lead to patient safety disasters (wrong limb amputated, double mastectomy on a woman without cancer, death due to a misplaced decimal point on a crucial medication, etc.) stem from the very same human problems in communication, cooperation and procedure that had caused far too many airline crashes.
Here's the main lesson: Humans can never be error-free, so if you build a safety system that expects only perfect performance by pilots, doctors, nurses or administrators, you "prewire" your system to achieve consistent disaster. Airliners flown by a captain who can't or won't hear the concerns of subordinates are the exact equivalent of a senior physician who does not have the benefit of critical corrections from those of lesser rank when that senior person does something human and makes a serious mistake.