In their damning 2008 commentary, “The Perfect Storm of Overutilization,” National Institutes of Health bioethicist Ezekiel Emanuel, MD, PhD, and Stanford economist Victor Fuchs, PhD, laid out the argument that overutilization was the most important contributor to high healthcare costs in the U.S.3 Both a greater volume of interventions and unnecessarily costly care contributed to this overuse, the authors suggested.
Subsequent reports by Thomson Reuters in 2010, an Institute of Medicine (IOM) roundtable on evidence-based medicine in 2011, and the RAND Corporation in 2012 largely agreed. In its report, “The Healthcare Imperative: Lowering Costs and Improving Outcomes,” the IOM offered a particularly sobering analogy for the degree of waste found in medical care.2 If other prices had grown as quickly as healthcare costs since 1945, the report estimated, a gallon of milk would now cost $48. Of the $2.5 trillion spent on healthcare in 2009, the report estimated that 30%, or $765 billion, was wasted. Of that amount, the report suggested, unnecessary healthcare services accounted for $210 billion, or 27%.
A Culture of “More”
What contributes to so much overuse? Drs. Emanuel and Fuchs cite multiple factors:
- Physician training and culture;
- The fee-for-service payment structure;
- Aggressive marketing by developers of tests, drugs, and procedures;
- Defensive medicine;
- A cultural preference for technological solutions; and
- A lack of transparency on the true costs of care.
The authors contend that each factor reinforces and amplifies the others, resulting in a “perfect storm of ‘more.’”
A major driver, several doctors agree, is a culture that has long embraced the “more is better” mantra. Brandon Combs, MD, assistant professor of medicine at the University of Colorado School of Medicine in Denver, puts it this way: “More information is better. More interventions are better. More scans are better. More surgery is better. More pills are better—this concept that if I’m getting more, if I’m spending more, if it costs more, then it must be kind of like a Mercedes. It must actually be better.” A collective “cultural blind spot,” he adds, leaves both doctors and patients unable to focus on anything beyond the upsides of care.
At the same time, medicine has reinforced the notion among trainees and attending physicians alike that doctors can never be wrong or miss a diagnosis.
“Diagnostic uncertainty really feeds into a system where we have ready access to lots of things,” Dr. Combs says. “We have such a supply of tests, whether that’s blood tests, whether that’s imaging tests, whether that’s access to consultations with subspecialists—we have a system that can supply whatever demand we seem to have.”
Dr. Shah calls it a “hidden curriculum” that imposes its will on doctors’ discretion. Case studies, for example, routinely focus on doctors ordering multiple tests in search of exceedingly rare causes of disease instead of being good stewards of limited resources.
“When you’re criticized by your colleagues or by your mentors, it’s always for things that you didn’t do but could’ve done, and it’s never about the things that you did do but didn’t have to,” he says.
Anthony Accurso, MD, instructor of medicine at the Johns Hopkins Bayview Medical Center in Baltimore, says the current system grew out of an apprenticeship model of medical training that dominated for much of the 20th century.
“You learn to do things the way they’ve always been done,” he says. About 20 years ago, however, healthcare providers began shifting toward evidence-based medicine. “That was a retreat from doing things the way they had always been done and a movement toward doing things that proved themselves to be effective through evidence and study,” he says.