In a world that can feel like it’s awash in misinformation—about COVID-19, politics, and Russian propaganda related to the Ukraine war, for example—we all need to make more of an effort to fight the problem, an expert said in a session at SHM Converge.
“We have a lot more work to do in terms of defining what it is we’re talking about, and moving beyond the buzzwords, to thinking about careful measurement, thinking about theory,” said Brian Southwell, PhD, director of the Science in the Public Sphere Program at Research Triangle Institute International, a non-profit research organization in Durham, N.C. that works with governments, businesses, and other clients on medical, social, and other types of projects. “And we have to avoid the old problem of reinventing the wheel and overlooking a lot of work that’s already been happening.”
Dr. Southwell and his colleagues worked on creating a definition of misinformation for the Annals of the American Academy of Political and Social Science that could be useful for understanding the problem better and for blunting its damage. They settled on: “Publicly available information that is misleading or deceptive relative to best available evidence at the time and counters statements by actors who adhere to scientific principles without adding accurate evidence for consideration.”
He elaborated on a few phrases of the definition. “Relative to best available evidence at the time” is an important phrase, he said, because it gets at the dynamic nature of science.
“If we’re not careful we’re going to end up talking about misinformation in a way that rules out our scientific endeavor,” he said. “Science itself is iterative, of course, and evolves over time. So it’s quite possible that a claim that held up in 1957 has less evidence today.”
The phrase “actors who adhere to scientific principles” gets at the idea of misinformation as a competition between groups.
“We tend to think about misinformation and the capital-T ‘Truth,’” Dr. Southwell said. “A lot of what we’re often talking about in the arena of science is a contestation between that which has been produced by rigorous effort and that which lies outside of that process.” An important part of the concern about misinformation is the concern that “people aren’t playing by the rules.”
The misinformation problem, he said, can be broken down into four main points.
One is that we are “biased toward acceptance,” he said.
“The way that we tend to encounter information and misinformation is to accept it at face value,” Dr. Southwell said. “That is not to suggest that we ignore our critical capacity to then make sense of it as perhaps being true or false. But initially, we take it in and process it as though it’s true.” This basic human tendency “puts us all in the same boat,” he said.
Another issue, he said, is that the reasons we share misinformation may have nothing to do with providing information to someone and everything to do with basic social impulses.
“A lot of misinformation-sharing is a matter of inadvertent, or at least unintentional, instances where people don’t necessarily want to share false information, but they’re trying to connect with other people to show that ‘You’re part of my collective identity, you’re a part of my tribe.’”
Also, Dr. Southwell said, our regulatory system emphasizes post-hoc detection of misinformation, rather than preventing its spread in the first place. Hospitalists are positioned to try to help, by connecting with patients, he said.
“You are in a spot where you are interacting with patients regularly in trusted relationships,” he said. “That’s actually where I think we ought to move forward with this, rather than trying to vanquish misinformation from existence.”
The final point, he said, is that “correction is hard.”
“You can correct misperceptions, it’s possible to do that—so that’s good,” he said. “But it’s difficult work.” The false claim has to be identified very explicitly and called out directly—“you have to fight fire with fire in terms of exposure.”
What can be done? The first step, he said, is to “encourage compassion” for those sharing misinformation.
That “is not to suggest that misinformation isn’t a concern—of course, it’s a really important one,” he said. “But I also think that what we need to do is focus less on shaming and blaming, less on shaking our head at the outlandish thing that our uncle or cousin or friend posted.”
Experts, including doctors, should embrace the chances they have to translate scientific information. We also need to learn which information sources other people are encountering, he said, and to empower people to seek correct information for themselves.
Also, he said, “We can try to encourage people to take a breath, pause for a second before they share.”
Dr. Southwell said that “transparency is going to be important.”
A large proportion of the country “generally thinks science is a good thing,” he said, but “we haven’t done a great job of educating as to what the process is going to look like, the fact that information and knowledge are going to evolve over time.”
Dr. Southwell also suggested that health care needs to do a better job of “pulling back the curtain showing the humanity of health care workers.”
“What I don’t want to suggest is that you all have to be somehow superhuman and ignore all the pain and injury that comes with that. It’s just to suggest there is a way forward.”
Tom Collins is a medical writer in South Florida.