You work in a small rural hospital. In one year, you admit six patients with acute myocardial infarction (AMI). You follow CMS and Hospital Quality Alliance guidelines for the eight process measures for AMI, and your hospital scores 100% for that year.
A neighboring hospital isn’t as lucky: One of its four AMI admits, a 99-year-old man, refuses a beta blocker at discharge. What could have been a perfect score (a beta blocker prescribed in four of a possible four cases, or 100%) is now 75%.
A study released in June by Duke University Medical Center elucidates the challenges faced by small hospitals when they report performance measures. Smaller hospitals, according to the study, are more likely to rate as top performers when reporting on the eight AMI process measures.1 However, the authors conclude, reports such as those required by Medicare, which ignore denominator size when assessing process performance, can unfairly reward or penalize hospitals.
“The scores can be very misleading,” says Randy Ferrance, DC, MD, a hospitalist at the 67-bed Riverside Tappahannock Hospital in Tappahannock, Va. “If we miss aspirin on discharge for one patient and everything else was perfect, we have the potential to slide into a lower percentile, whereas larger hospitals can miss aspirin at discharge and do just fine.”
Small Denominators, Big Differences
Doug Koekkoek, MD, is in a unique position to see how performance and quality metrics vary by hospital size. As chief medical officer of the Providence Hospitalist Programs in Oregon, Dr. Koekkoek oversees two tertiary facilities, Providence Portland Medical Center (483 beds) and Providence St. Vincent Medical Center (523 beds), as well as a 77-bed community hospital (Providence Milwaukie Hospital), a 40-bed community hospital (Providence Newberg Medical Center), and a 24-bed critical access hospital (Providence Seaside Hospital).
“When we do a roll-up, looking at our appropriate care score, which looks at all the CMS metrics for AMI, congestive heart failure, and pneumonia, we can see that in the bigger institutions, where you have a much bigger denominator of patients who qualify for each diagnosis, the trends are fairly even,” Dr. Koekkoek says. “But in the smaller hospitals, there is much greater variability.”
Rather than focus on each month’s scores, he looks at trends for several months to get a better sense of how his hospitals rate. “You can run at 100% on the heart-failure measures for nine months and then, if your denominator is 10 cases in a quarter and you miss only two or three of the measures, all of a sudden, you’re in the 80% or 70% performance percentile,” he says. “You don’t get a full picture unless you’re looking back over the last six, eight, or 10 months.”
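The effect Dr. Koekkoek describes, and the beta-blocker example above, come down to simple division. As a minimal sketch, the short Python snippet below (using hypothetical case counts chosen only to mirror the examples in this article, not actual CMS data) shows how the same single missed measure moves a small hospital’s score far more than a large one’s:

```python
# Illustrative sketch: how denominator size drives score volatility.
# Case counts are hypothetical, chosen to mirror the examples above.

def process_score(eligible_cases: int, missed: int) -> float:
    """Percentage of eligible cases in which the measure was met."""
    return 100 * (eligible_cases - missed) / eligible_cases

# A small hospital with 4 eligible AMI cases misses one measure:
print(process_score(4, 1))    # 75.0 -- one refusal drops a perfect score to 75%

# A large hospital with 200 eligible cases misses the same single measure:
print(process_score(200, 1))  # 99.5 -- essentially unchanged
```

The arithmetic is the whole story: with a denominator of four, each case is worth 25 percentage points; with a denominator of 200, each is worth half a point.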
The American Hospital Association (AHA) recommends presenting data to consumers in the same way. “We encourage our hospitals to not let the data themselves tell the story, but to help set them in context and portray to the communities they serve exactly what the data mean,” says Nancy Foster, AHA’s vice president for quality and patient safety.
Foster concedes that the issue raised in the Duke study, that quality scores don’t account for low case volumes, has plagued the data-reporting process, but the AHA believes the process should continue. “We firmly believe that all hospitals ought to be sharing good, reliable information on the quality of care they’re providing with the communities they serve,” she says.
Document Challenges
Conveying an accurate representation of your hospital starts with appropriate documentation, says Christian Voge, MD, a hospitalist with Central Coast Chest Consultants, which provides coverage to Sierra Vista Regional Medical Center and French Hospital Medical Center in San Luis Obispo, Calif.