It was during the fogbound afternoon of March 27, 1977, on Tenerife in the Canary Islands, that a revolution in both aviation and medical safety began with a single human error: a Boeing 747 captain's mistaken assumption that a critical pre-flight step had been performed.
The first and second officers almost caught the mistake. But rather than stop the process, the two other crew members convinced themselves that the captain (who was the chief pilot of KLM Royal Dutch Airlines) was too senior and too good to be wrong. Captain Jacob Van Zanten couldn't possibly be starting a takeoff roll with a half-million pound airliner in thick fog without first obtaining the takeoff clearance from the control tower, and without the assurance that the unseen runway ahead was clear of other aircraft.
But Van Zanten had forgotten about the clearance in his haste to get under way.
Without the benefit of hearing his subordinates' concerns, the captain accelerated down the fog-shrouded runway, breaking out of the fog just in time to see a Pan American 747 sitting sideways on the runway ahead. Too fast to stop and too slow to lift over the other jumbo, the KLM jet plowed into the Pan Am, killing 583 people in the worst accident in aviation history. Only a few aboard the Pan Am jet escaped with their lives. No one -- including Van Zanten -- survived aboard the KLM plane.
No negligence, no carelessness, no incompetence caused the disaster. Instead, the accident arose from the simple truth that even a chief pilot -- even one of the best airline pilots on the planet -- could make a human mistake that the "system" should have been able to catch in time.
The Tenerife disaster forever changed aviation's approach to safety by sparking a revolution in the way pilots cooperate and openly communicate with each other -- and that revolution is the prime reason we have now passed four years without a major airline accident in the United States. In many respects, those 583 lives were not lost in vain, since the lessons learned have been thoroughly applied to save countless others in aviation, and now in medicine!
In the past fifteen years, the principles of teamwork and communication and recognition of the eternal propensity for human failure so glaringly illuminated on that debris-strewn runway at Tenerife have begun to be adopted by physicians and nurses and pharmacists as American health care comes to grips with the critical need for its own safety revolution.
Traditionally, doctors have been trained to expect themselves to perform without error and without the need for anyone else's advice. That model is simply dangerous: it cuts off vital information, assumes nonexistent perfection, and creates a national cottage industry of talented mavericks wholly unsupported and unable to benefit from the mistakes of others.
In short, while airline flying has become a low-risk industry and literally the safest method of travel, health care (in terms of the possibility of being unnecessarily injured by medical mistake) is very high-risk -- and not just because hospitals tend to attract sick people.
According to "To Err Is Human," a brave and pivotal report published by the Institute of Medicine in 1999, somewhere between 44,000 and 98,000 Americans die in hospitals annually from medical mistakes. We're not talking about malpractice -- accidents caused by incompetent doctors or nurses. We're talking about errors made by fallible humans that were not caught in time to prevent harm to a trusting patient, and many of us involved in this issue believe those figures are substantially low.
But where on earth is the crossover point between aviation and medicine? Why would physicians and nurses and hospital leaders look to the flying of airplanes for improvements in patient safety? Isn't the practice of medicine far more complex and subjective and variable than flying even the most complex airplanes?
The truth is, aviation and medicine are very closely related because they both depend utterly on imperfect humans, whether in operating rooms or 747 cockpits.
When you accept the reality that even the best and the brightest among doctors as well as airline pilots will still make mistakes despite decades of experience and dedication and training, you realize that a system not designed to expect and safely absorb human error will constantly suffer from those human mistakes.
In commercial aviation, our vast improvements in safety resulted from learning to expect human errors and building systems such as checklists and procedures and standard practices to minimize error but still absorb any reasonably expected screw-up before it progressed to tragedy.
We learned to teach co-pilots, for instance, that even if God himself is in the left seat, they still have to speak up instantly when something is wrong. Instead of the autocratic leader who needed and accepted no advice, we've redefined leadership by creating strong captains who know how to create a team to help them make better, safer decisions. In other words, we built exactly what health care is now struggling to create: a cooperative, collegial system accepting of its human weaknesses and completely opposite to the traditional hierarchical medical structure that assumes no physician can make a critical error.
Moreover, health care now understands that the systemic failures that most often lead to patient safety disasters (wrong limb amputated, double mastectomy on a woman without cancer, death due to a misplaced decimal point on a crucial medication, etc.) stem from the very same human problems in communication, cooperation and procedure that had caused far too many airline crashes.
Here's the main lesson: Humans can never be error-free, so if you build a safety system that expects only perfect performance by pilots, doctors, nurses or administrators, you "prewire" your system to achieve consistent disaster. Airliners flown by a captain who can't or won't hear the concerns of subordinates are the exact equivalent of a senior physician who does not have the benefit of critical corrections from those of lesser rank when that senior person does something human and makes a serious mistake.
Airline pilots not possessed of a medical degree do not know how to practice medicine. But we have learned the hard way how to rely on each other and create a true collegial team that focuses on safety as our common goal, and we've created leaders who take pride in how well they can extract, orchestrate, and utilize all the human talent available to make better decisions and catch errors long before they hurt someone. Physicians, and health care in general, are borrowing heavily now from this reservoir of high-cost experience, and with any luck, corporate America will soon wake up to the same principles.
John J. Nance, ABC News' aviation analyst, is a veteran 13,000-flight-hour airline captain, a former U.S. Air Force pilot and a lieutenant colonel in the Air Force Reserves. He is also a New York Times best-selling author of 17 books, a licensed attorney, a professional speaker, and a founding board member of the National Patient Safety Foundation. A native Texan, he now lives in Tacoma, Wash.