For hospitalists, human factors (HF) knowledge is most useful in process improvement, says John Gosbee, MD, MS, a human factors engineering and healthcare specialist at the University of Michigan. Dr. Gosbee, who has worked with hospitalists in Ann Arbor and around the country, originally studied aerospace medicine, pursued a subspecialty in occupational medicine, and from 1988 to 1992 worked at NASA designing space hospitals. In the dozens of lectures and workshops he has conducted, he has learned that many physicians resist learning about HF. At first they protest, claiming they “didn’t go to medical school to become engineers” or “weren’t hired to have you tell us we need to be some kind of designer or computer-science software evaluator.”
Dr. Gosbee couldn’t agree more. But after the a-ha! moment, which usually comes in an interactive exercise when hospitalists see that a poor system design is an obstacle to safety and process flow, they open up to adopting the HF mindset. Once on board with HF, hospitalists are quick to translate its theories to their own practices, identifying potential vulnerabilities and risks.
Manufacturers of healthcare equipment and systems don’t want to hear from “safety geeks,” Dr. Gosbee says; the companies want to hear from front-line providers who regularly use the products. “Hospitalists are in a great position to provide that input because they see what happens across a broad swath of hospital settings,” he says, “and they could amalgamate the fact that everyone across specialties is having some trouble with this computer screen or new infusion device.”
Dr. Gosbee’s first-hand knowledge and experience solving hospitalist issues with HF techniques evolved into a teaching career. He says the university administration supports his belief in the practicality of HF lessons, and he now works as the lead instructor for a majority of the university’s medical residents.
“Human factors engineering is an efficient way to flip people’s brains around 180 degrees toward systems thinking,” Dr. Gosbee explains, “which is required if the organization wants to become a high-reliability organization.”3
Examples in Medicine
Russ Cucina, MD, MS, hospitalist at the University of California San Francisco Medical Center, describes a practical example of human factors engineering in a simple, widely used design. When cars ran on leaded gasoline, the design of the leaded gas pump nozzle precluded it from being inserted into an unleaded gas tank. “Even though one was clearly labeled leaded and the other unleaded, human beings are bad at catching those things, especially when they’re in a hurry and under stress,” says Dr. Cucina, whose research includes clinical human-computer interaction science with an emphasis on human factors and patient safety.
That kind of built-in constraint is exactly what is missing from the Swan-Ganz catheter design. The three ports (proximal, middle, and distal) connecting the catheter to the ICU monitor all have the same shape, making it easy to mistakenly connect one or more to the wrong port. “You’d think the manufacturers would shape the connectors in a way that would preclude incorrect connections,” Dr. Cucina says, “but that has not been done. We leave it to the vigilance of the bedside nurse or intensivist or hospitalist to hook these up correctly, rather than redesigning them so that cannot be done incorrectly.”
One way to think about human factors engineering is the old image of forcing a round peg into a square hole. In the hospital setting, a round peg forced into a square hole equates to an error; HF redesigns the system so that each peg fits only its matching hole. “Were you to apply human factors to the Swan-Ganz catheter port connectors,” Dr. Cucina says, “you’d have round into the round hole, square into the square, and triangular into the triangular. You’d have no choice but to do the right thing.”
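For readers who think in software terms, the same forcing-function idea can be sketched in code. The example below is purely illustrative and hypothetical (the port and function names are invented, not drawn from any real device interface): just as differently shaped connectors make a wrong hookup physically impossible, distinct types make a wrong hookup a compile-time error.

```typescript
// Hypothetical sketch of a "forcing function" expressed in TypeScript.
// Each class carries a private field so the types are nominally distinct,
// the way each connector would have its own physical shape.

class ProximalPort { private readonly kind = "proximal"; }
class MiddlePort   { private readonly kind = "middle"; }
class DistalPort   { private readonly kind = "distal"; }

// Each monitor channel accepts only one port type.
function connectProximal(port: ProximalPort): void { /* hook up channel */ }
function connectMiddle(port: MiddlePort): void { /* hook up channel */ }
function connectDistal(port: DistalPort): void { /* hook up channel */ }

connectProximal(new ProximalPort()); // OK: round peg, round hole
// connectProximal(new DistalPort()); // Rejected by the compiler:
//                                    // the wrong "shape" simply cannot connect
```

The design choice mirrors Dr. Cucina’s point: the error is prevented by the structure of the system itself rather than by the vigilance of whoever is doing the connecting.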