Round One
“Plane down, mass casualties possible; initiate disaster plan.”
The page interrupted my evening out with friends a few Saturday nights ago. Looking up from my dinner, I noticed the restaurant television had cut away to a news story at Denver International Airport. Continental Flight 1404, en route to Houston, had crashed during takeoff, belly-flopping to a fiery rest a few hundred yards off the runway. The airport is about 10 miles from the nearest hospital—mine.
The situation ended considerably better than originally expected. Thirty-eight people were treated at several Denver hospitals, 11 of them at my hospital, and most were discharged from the emergency department. No one died. The crash remains under investigation, and little is known about its cause.
Round Two
“Give me a call. I need to talk to you urgently.”
That page arrived the following Monday morning. It was from a co-worker. There had been an unexpected bad outcome in a young male patient. The hospital’s quality and risk management group had found out about the case and called for a peer review. My colleague was scared; would he be publicly criticized? Punished? Fired?
The patient had been admitted with a chronic disease flare-up. He was on the mend after receiving an increased dose of medication. The night before his scheduled discharge, he developed a new symptom, was evaluated by the cross-cover team, and a plan was set in motion. However, a critical lab result that became available overnight was mistakenly never called to the covering provider, and it went unnoticed by the primary team, which had triaged the patient to the end of its rounds. By then, he was in extremis.
Planes and Patients
The proximity of these two events provoked comparisons.
By now, comparing healthcare to the aviation industry has become cliché. Both industries demand highly trained, skilled operators; errors in both can result in death; both depend on technology; and both have turned to systems engineering to improve efficiency and reduce mistakes. It is in how they apply that engineering, though, that the two industries diverge, and I think we get it wrong in medicine.
In aviation, there are highly prescriptive algorithms that must be followed, and much of a pilot’s work is under constant scrutiny by air traffic controllers and data recorders. A deviation from protocol rarely goes unnoticed. Errors are systematically compiled, scrutinized, and categorized, with the aim of further refining systems to reduce the likelihood of future errors. Although blame is often assigned, it is assigned in the context of improving the system. Thus, the aviation industry is awash in data to inform and fuel its systems engineering.
Meanwhile, in medicine our indelible sense of autonomy breeds variability, which is not only tolerated but often unnoticed. Further, we employ a model of error analysis that focuses on affixing blame, as if culpability alone could prevent future errors. Someone made an error, a bad outcome ensued, and the culprit must be identified and punished. The result is reprimand, remediation, or banishment from the medical staff. At times this is an appropriate response, as some errors are so egregious, or so indicative of a chronic problem, that they warrant it. More often, though, the punitive process misses the mark because it focuses on blame rather than on preventing the next error. Unlike aviation, medicine is thus left bereft of the data needed to improve its systems of care.