COMMENTARY || To win the fight against health and wellness bunk, we must leave the post-truth era in the past
Health-trend debunker Timothy Caulfield offers a four-point prescription to help science and critical thinking “rise above the noise of nonsense.”
By TIMOTHY CAULFIELD
While pseudoscience and quackery have been around a long time, the 2010s were truly the decade of bunk. The reach and influence of misinformation have intensified to the point that it feels nearly impossible to find the truth in the churning sea of falsehoods, exaggerated claims and fear mongering.
Social media advertising pushes anti-vaccine myths; celebrity health brands aggressively sell rubbish ideas and products; health-care providers and research institutions hype unproven therapies; and wild conspiracy theories circulate about everything from GMOs to fluoride to milk. And the media reporting on all these topics often adds more confusion than clarity.
As someone who studies the public representations of science, I found the decade both exhausting and a bit depressing. But there is good news. More and more entities—including governments, universities and professional organizations—are recognizing the importance of this issue. This year, the World Health Organization declared the spread of vaccine misinformation one of the top threats to public health.
The 2010s have been called the post-truth decade. What can we do to make the next decade more about the truth, if only a little? What can we do to create a cultural shift that allows science and critical thinking to rise above the noise of nonsense? While this is obviously a complicated issue that will require us to deploy a host of strategies, here are a few to get us started.
Remember that facts still matter
Correcting misinformation is a complex and increasingly difficult endeavour. Falsehoods and exaggerated claims are injected into an information environment that is already clouded by a tangle of values, ideological agendas and preconceived ideas about what is healthy and what is not. As such, merely making the science-informed facts publicly available—be it about the value of vaccines or the uselessness of detox diets—will often have little impact.
A body of evidence suggests that just correcting misinformation (debunking) will not change minds and may even cause some to become more entrenched in their misplaced views. While the influence and prevalence of this “backfire effect” are frequently overstated, its existence highlights how challenging the battle against misinformation can be.
In addition, because of our strong tendency to consume information that confirms our beliefs—a psychological phenomenon called the confirmation bias—we often do not even see, read or consider alternative views.
But despite these and many other psychological barriers that can make us less than receptive to corrective evidence, presenting people with the facts can still make a difference. A 2015 study, for example, found that emphasizing the strength and breadth of the scientific consensus on a topic is an effective strategy, perhaps because it helps to correct perceptions of false balance (that is, the perception that the evidence on either side of an issue is more balanced or contested than it actually is). A 2019 study found that not responding publicly to science deniers on topics such as vaccination can have a negative effect on public beliefs and actions; silence from the experts leaves inaccuracies unchecked and, the researchers found, produces “the worst effect.” But the same researchers also found that a fact-filled rebuttal that corrects specific inaccuracies can make a difference.
So, yes, while facts alone will often not be enough, facts still matter. We should not shy away from battling bunk with good science. But how we provide that science also matters.
Recognize the power of a good story
There is some evidence that humans are biologically predisposed (thanks, evolution) to respond to stories and anecdotes. This is one reason misinformation can have such a persuasive punch: it is often wrapped in a compelling narrative. And, unfortunately, those pushing bunk health products and ideas are particularly adept with anecdotes. They are used to hawk all sorts of science-free hokum, such as miracle cures for debilitating diseases, celebrity diets and anti-vaccine fear mongering.
In many ways, social media are platforms for sharing personal narratives. Even if you don’t want to, it is easy to come across a post reflecting on a personal experience that, intentionally or not, pushes a science-free position. For example, a recent study found that even though Instagram has more pro-vaccine posts than anti-vaccine ones, the anti-vaccine posts generate more engagement. This is, at least in part, because opponents of vaccines are more likely to use powerful narratives, usually about harm.
These kinds of health-related anecdotes leverage several of our hard-wired psychological tendencies: the negativity effect (we respond more strongly to negative than positive information), the availability bias (dramatic examples that are easy to recall can be more influential), and our natural attraction to a relatable story.
Anecdotes are also often used as the primary rationale for science-free health-care services. A study I did with my colleague Alessandro Marcon found that those arguing for alternative medicine—in this case, chiropractic services—most often support their position with anecdotes, rather than science. Unfortunately, even though anecdotes should not be considered good evidence, they can be very convincing, so much so that they can interfere with our ability to think scientifically.
Given this reality, the battle against health misinformation will require science advocates to use a variety of engaging and creative communication strategies, including stories, images and art, and shareable messages that are social-media friendly (remember MediaSmarts’ House Hippo campaign?). Science needs to be inserted into the broader conversation in a way that will allow it to compete with the narrative-filled misinformation circulating in popular culture. Absorbing and entertaining science stories are everywhere. Let’s use them.
Encourage vigilance and critical thinking
While correcting misinformation is essential, the best way to stop it from having an adverse impact on public health is to encourage the application of critical thinking. Studies have consistently found that it is possible to teach such skills, even to the relatively young, and that this can help to inoculate individuals against the sway of health bunk. This should include providing basic tools to evaluate claims of efficacy, such as the reality that an anecdote or a testimonial is not good evidence, no matter how compelling.
Let’s encourage a culture of fact-checking and a reverence for accuracy. We need to constantly remind ourselves (and others) to think before sharing. In most situations, people do not intentionally spread misinformation because they have some malevolent agenda (although this certainly happens). In the kerfuffle of our daily lives, we are simply too distracted to pause and consider, for instance, the veracity of that post that claimed tanning your bum is health-enhancing (an actual story, and, no, it’s not a good health strategy). However, a recent study found that simply reminding people to think about the concept of accuracy can increase the quality of the news they share (please consider this a reminder).
Embrace better science, better health care and the value of trust
Rising public distrust of science institutions (44 per cent of Canadians think scientists are elitist), ideological polarization and frustration (justified or not) with the health-care system have created an environment that has allowed misinformation to thrive. And, of course, the spreading of misinformation has facilitated the growth of these kinds of sentiments, making people even more distrustful and receptive to misleading health information. A destructive feedback loop is creating a science-sucking vortex that is pulling us into an “all knowledge is relative and not to be trusted” Dark Age.
So while we need to fight health misinformation with creative communication strategies and critical thinking, we also need to tackle the systemic issues that make the misinformation so intuitively appealing and believable. When people feel as if conventional health-care providers ignore them, or they hear about pharmaceutical scandals, it is much easier for them to believe stories about the efficacy of alternative therapies and conspiracy theories about Big Pharma. When regulated health professionals are allowed to market unproven therapies, it may seem hypocritical to condemn the pseudoscience pushed by celebrity wellness gurus.
Good research, robust oversight and scientific integrity are essential to the struggle against misinformation. Without good science and public trust in that science, I’m not sure if the fight against misinformation is winnable.
Timothy Caulfield is a Canada Research Chair in Health Law and Policy at the University of Alberta and host of the Netflix documentary series A User’s Guide to Cheating Death.
This opinion-editorial originally appeared Jan. 6 in The Globe and Mail.