Effective Science Communication and Countering Misinformation
Summary
- Subgroup heterogeneity: Tailor any intervention to the target audience, taking into account social, political, and historical context.
- Prioritization and risk-benefit analysis: Some low-impact misinformation may be best left alone. Be mindful of reinforcing a rhetorical frame created by someone else.
- Inoculation: Interventions before exposure to misinformation appear generally effective, but less so for highly polarized issues.
- Debunking: Can reduce belief in misinformation, but usually only short-term (continued influence effect), thus must be repeated often. Avoid needless repetition of the misinformation during interventions.
- Exposure effect: Familiarity increases belief. Debunk misinformation and state the facts repeatedly.
- Avoid cognitive gaps: Convey the problem and the solution; debunk the misinformation and provide the alternative explanation. Corrections can be particularly successful if they explain the motivation behind an incorrect report.
- Reaffirmation: Frame evidence in a worldview-affirming manner by endorsing (some) values of the recipient.
- Tone: Avoid scare tactics and authoritative condescension. Recipient should be empowered to make an informed decision and not feel manipulated. Listen to concerns and ask open-ended questions.
- Science literacy: Explaining the “how” of science, not only the “what”.
- Scientific consensus: Underscoring the degree of scientific consensus.
- Relatability: Data alone may numb the audience—make it relatable, local, personal. Tell stories, but grounded in facts.
Disputed:
Inoculation preexposure warning: Carey et al. (2025) found that inoculation interventions without forewarnings performed better than those with forewarnings, in contrast to recommendations by Lewandowsky et al. (2020). Forewarnings may trigger skepticism of fact-checking perceived as partisan.
Clarity and detail: Simple language and clear statement of facts are more cognitively attractive than complicated refutations (Lewandowsky et al. 2012). However, a meta-analysis by Chan and Albarracín (2023) shows a larger effect size for detailed corrections (refutations). Personal comment: detailed corrections may be more likely to fill the “cognitive gap” created by debunking, but I assume this depends on context. Nonetheless, the detailed correction should remain accessible and avoid jargon (Lewandowsky et al. 2020).
Backfire effects: A backfire effect occurs when a correction inadvertently increases belief in, or reliance on, misinformation relative to a pre-correction or no-correction baseline. Backfire effects occur only occasionally, and the risk of occurrence is lower in most situations than once thought.
General sources
Countermeasures for Mitigating Digital Misinformation: A Systematic Review
IPIE (2023)
Using psychology to understand and fight health misinformation: An APA consensus statement
American Psychological Association (2023)
Journalist Field Guide: Navigating Climate Misinformation
Science’s reform movement should have seen Trump’s call for ‘gold standard science’ coming, critics say
Cathleen O’Grady, Science Insider, 10 June 2025
[Berna] Devezer [University of Idaho metascientist] says reformers could learn from geneticists, who have long grappled with racists who weaponize and distort their findings. In both fields, “mostly good-faith people are making very strong claims that can easily be picked up by people in bad faith and used to achieve their ideological goals,” says Kevin Bird, an evolutionary biologist at the University of California, Davis. […] Many geneticists have learned to write defensively, anticipating how their work could be distorted. They frame their papers carefully, understanding that many readers may never see more than the abstract, which makes it unhelpful to bury caveats deeper in the paper or in separate blog posts, Bird says.
A systematic review of communication interventions for countering vaccine misinformation
Whitehead et al. (2023)
Some strategies, such as scare tactics, appear to be ineffective and may increase misinformation endorsement. Communicating with certainty, rather than acknowledging uncertainty around vaccine efficacy or risks, was also found to backfire. Promising approaches include communicating the weight-of-evidence and scientific consensus around vaccines and related myths, using humour and incorporating warnings about encountering misinformation. Trying to debunk misinformation, informational approaches, and communicating uncertainty had mixed results.
How to speak to a vaccine sceptic: research reveals what works
Pearson (2025)
- Listen, don’t judge
- Enable informed decision-making, don’t tell people what to do
- Share personal experiences
- Acknowledge uncertainties around new vaccines
Metascience can improve science — but it must be useful to society, too
Nature (2025)
If researchers and science communicators know how to transparently discuss the ‘how’ as well as the ‘what’ of science, it could help them to respond to those who try to undermine science, for example, when people wrongly say that scientists cannot be trusted if they change their minds.
Ethan Siegel, a cosmologist and science communicator, suggests that focusing on asking people to master scientific and quantitative skills is less important than making all citizens better understand “what the enterprise of science actually is,” while “fostering an appreciation for how applying the best-known science to our societal problems positively impacts all of us.” (Mann and Hotez 2025)
Inoculation
Inoculation (aka prebunking): intervention before exposure to misinformation.
In the US, [retrospective corrections and prebunking messages] each immediately increased election confidence and reduced fraud beliefs, with prebunking showing somewhat more durable effects. […] Prebunking again increased confidence and decreased fraud beliefs but only when the forewarning was omitted, suggesting that novel factual information is responsible for the observed effects of the prebunking treatment. […] The forewarning message may have triggered skepticism among Republicans, who predominantly regard fact checkers as unfair and partisan (in contrast to Democrats who typically regard fact checkers as fair). (Carey et al. 2025)
Research shows that prebunking as prior “vaccination against misinformation” is even more effective than countering it afterward. […] Its efficacy has been demonstrated for many topics, and it can reduce susceptibility to misinformation in different contexts and cultures. (Kessler 2025)
As misinformation is hard to dislodge, preventing it from taking root in the first place is one fruitful strategy. Several prevention strategies are known to be effective. Simply warning people that they might be misinformed can reduce later reliance on misinformation. […] The process of inoculation or “prebunking” includes a forewarning as well as a preemptive refutation and follows the biomedical analogy. […] The effectiveness of inoculation has been shown repeatedly and across many different topics. (Lewandowsky et al. 2020)
Inoculating against science denial
John Cook, The Conversation
Information Integrity about Climate Science
Elbeyi et al. (2025)
Although one concern has been that this strategy might backfire, reinforcing opinions that are out of step with the scientific consensus, a range of studies have found inoculation to be an effective approach to countering misleading information. Research has further suggested that communication of the scientific consensus, in combination with warnings about the circulation of politically and commercially motivated misleading information, helps support information integrity about climate science. [cf. (Lewandowsky, Gignac, and Vaughan 2013)]
It should be noted, however, that other research has questioned the effectiveness of inoculation, depending on the issues and domains in question. One study showed that while corrective messages influenced individuals’ perceptions of HPV vaccination, there was no evidence of an effect on more partisan or polarizing issues such as gun control and climate change.
Debunking
Busting myths: a practical guide to countering science denial (John Cook, The Conversation)
Debunking Handbook 2020
Lewandowsky et al. (2020)
Continued influence effect: The continued reliance on inaccurate information in people’s memory and reasoning after a credible correction has been presented.
Illusory truth effect: Repeated information is more likely to be judged true than novel information because it has become more familiar.
Misinformation can also be intentionally suggested by “just asking questions”, a technique that allows provocateurs to hint at falsehoods or conspiracies while maintaining a facade of respectability.
Debunkers should also be mindful that any correction necessarily reinforces a rhetorical frame (i.e., a set of “talking points”) created by someone else. You cannot correct someone else’s myth without talking about it. In that sense, any correction—even if successful—can have unintended consequences, and choosing one’s own frame may be more beneficial. For example, highlighting the enormous success and safety of a vaccine might create a more positive set of talking points than debunking a vaccine-related myth.

To date, only three factors have been identified that can increase the effectiveness of retractions: (a) warnings at the time of the initial exposure to misinformation, (b) repetition of the retraction, and (c) corrections that tell an alternative story that fills the coherence gap otherwise left by the retraction. […] Corrections can be particularly successful if they explain the motivation behind an incorrect report. (Lewandowsky et al. 2012)
Since research has shown that people tend to interpret new information in a biased way based on their cultural predispositions, values and worldviews, scientists and other experts should consider the importance of value similarity when communicating their findings (Kahan, Jenkins-Smith & Braman, 2011). For example, they could provide information in a way that causes people to feel that their values are being affirmed rather than threatened (G. L. Cohen, Aronson, & Steele, 2016; Kahan, 2010). (Cologna and Siegrist 2020)
A Systematic Review of Communication Interventions for Countering Vaccine Misinformation
Whitehead et al. (2023)
A meta-analysis of social media-based health misinformation corrections found that debunking is an effective strategy – but that correcting misinformation related to infectious diseases is more difficult, which may explain the more heterogeneous effects found in this review.
Conspiracy theories and misinformation in digital media: An international expert assessment of challenges, trends, and interventions
Mahl et al. (2025)
[…] journalists and fact-checkers, in particular, emphasized that limited resources and low return on investment for fact-checking initiatives are key challenges. Some experts – practitioners and scholars alike – emphasized that debunking messages often do not reach their intended audiences, as conspiracy theories or misinformation are disseminated through (semi-)public or private platforms, while evidence refuting false information is either distributed through other channels or inaccessible due to paywalls. In addition, some experts warned that actors who compose debunking messages often lack subcultural knowledge.
However, a recent meta-analysis of science-relevant misinformation (including health) found that corrections were, on average, not effective (Chan and Albarracín 2023), though the average masks substantial variation in effectiveness across studies and designs. (American Psychological Association 2023)
Backfire effects
Ten years ago, scholars and practitioners were concerned that corrections may “backfire”; that is, ironically strengthen misconceptions rather than reduce them. Recent research has allayed those concerns: backfire effects occur only occasionally and the risk of occurrence is lower in most situations than once thought. (Lewandowsky et al. 2020)
I’ve come to the conclusion that both sides are correct, but misleading on their own. Motivated reasoning – and even belief polarization and backfire effects – do occur. These are real processes. What’s important to know is that in the seminal studies, these effects were often confined to those individuals who held the strongest or most extreme beliefs on an issue – and that’s probably right. But they don’t describe the average individual in an average situation. It’s not our default state. There often needs to be some contextual trigger that decreases people’s motivation to be accurate and increases their motivation to belong, affiliate, and prioritize political commitments. That’s a positive thing. On the negative side, there are more and more situations which ramp up these motivations quickly, including divisive social contexts such as online echo chambers, filter bubbles, and digital wildfires. (Van der Linden 2023)
Misinformation and Its Correction: Continued Influence and Successful Debiasing
Lewandowsky et al. (2012)
- Worldview Backfire Effect: Evidence that threatens worldview can strengthen initially held beliefs -> 1) Frame evidence in worldview-affirming manner by endorsing values of audience; 2) Self-affirmation of personal values increases receptivity to evidence.
- Overkill Backfire Effect: Simple myths are more cognitively attractive than complicated refutations -> Use fewer arguments in refuting the myth — less is more
- Familiarity Backfire Effect: Repeating the myth during a correction makes it more familiar, which can increase belief -> Emphasize the facts and avoid needless repetition of the myth.
Message form
Text vs visuals.
- A meta-analysis on correcting health misinformation on social media found no significant difference between text messages and information that combined text and images. (Walter et al. 2021)
- IPIE Climate Report (Elbeyi et al. 2025) cites two studies claiming benefit of visuals:
Visual communication in still and moving images is a key feature of contemporary information environments and can be mobilized to stimulate engagement and effect. One contribution [101] argued that well-supported arguments and accessible forms of communication are not mutually exclusive for enhancing public engagement. For example, images of retreating glaciers can be more effective than abstract statistics in conveying the reality of climate change [101]. In a similar vein, one study [247] highlighted visual storytelling through art and media as a mode of nurturing audiences’ long-term emotional involvement in the natural environment.
Well-designed graphs, videos, photos, and other semantic aids can be helpful to convey corrections involving complex or statistical information clearly and concisely. (Lewandowsky et al. 2020)
Message detail
[…] to maximize efficacy, corrections should provide detailed arguments rather than simple denials. (Chan and Albarracín 2023)
This contradicts the “overkill backfire effect” discussed in Lewandowsky et al. (2012).
Avoid scientific jargon or complex, technical language. […] The truth is often more complicated than some viral false claim. You must invest effort in translating complicated ideas so they are readily accessible to the target audience—so they can be easily read, easily imagined, and easily recalled. Lewandowsky et al. (2020)
Emotional appeal
Fear-based vs hopeful
- A meta-analysis on communication interventions for countering vaccine misinformation found that scare tactics appear to be ineffective and may backfire, increasing misinformation endorsement. (Whitehead et al. 2023)
Information Integrity about Climate Science
Elbeyi et al. (2025)
Hope, in particular, stands out as a potentially helpful component of education about climate change. Highly negative representations fuel hopelessness and have been found to deactivate and demotivate students and young people [268], [269]. Importantly, however, studies of hope and climate agency have identified two varieties: constructive hope, recognizing the outcomes of taking action, and denial hope, implying that climate change does not represent a significant human problem.
Hopium or empowering hope? A meta-analysis of hope and climate engagement
Geiger, Dwyer, and Swim (2023)
Hope-related associations with climate engagement:
- Positive effect: hope related to solutions, specific actions, and personal efficacy
- No effect: hopeful messages promoting a sense of possibility that society can address climate change
- Negative effect: hope grounded in denial of climate change
Empowering Climate Action: Three Steps to Effective Communication
Verified for Climate
- Use authoritative scientific information
- Convey the problem and the solutions
- Mobilize action
Hope: The Most Compelling Case for Change
Verified for Climate
Stories told by communities in their own way resonate more widely on a personal, emotional and cultural level – a tactic bad actors have become adept at exploiting.
Individual-level vs systems-level interventions

It is possible that system-level interventions could be more effective than individual-level ones in curbing the spread of misinformation—for example, by reducing the harmful effects of recommender algorithms, demoting misinformation in online search platforms and social media, or removing content in predatory journals from medical databases (Swire-Thompson & Lazer, 2022). However, individual-level interventions have fewer potential ramifications for freedom of expression, and they rely less on the ability and willingness of technology companies to combat harmful content. (American Psychological Association 2023)
In the wider landscape beyond climate, “disinformation” provides plenty of programmatic grants for activities like monitoring, tracking, reporting on bad actors, and developing artificial intelligence tools to detect misinformation. A lot of this antidisinformation work is funded by Google and Meta themselves, and critics argue the field of misinformation studies is influenced by the soft power and framing of these companies. […] There are three main risks here. One, we open the door to a censorship regime — either a legal censorship regime or a de facto one through a set of content-filtering algorithms — that might not serve us in the end, when “our side” is not in control of those institutions or machinery to make them optimized for our worldview. (A Climate Disinformation Focus Takes Us the Wrong Way, Jacobin)
[Focusing on solutions that are implemented at the level of the individual] frames the misinformation problem in individual, not systemic, terms. This risks drawing attention away from policies that seek to bring about systemic change. We argue that none of the [individual-level] interventions discussed in this paper, either individually or taken together, are enough to comprehensively address and counter misinformation. (Roozenbeek, Culloty, and Suiter 2023)
Public trust in scientists
The role of trust for climate change mitigation and adaptation behaviour: A meta-analysis
Cologna and Siegrist (2020)
For example, perceptions of honest commitment can be increased through scientists’ communication being perceived as impartial rather than persuasive (Rabinovich, Morton, & Birney, 2012). Further, by having a low personal carbon footprint, climate scientists conform with expectations regarding concern and care, which has been found to strongly affect individuals’ intentions to alter their personal energy consumption (Attari, Krantz, & Weber, 2016). […] As an editorial in Nature stated, ‘scientists will be only as persuasive as they are trusted — which means that preserving and cultivating the public’s trust must be the scientific community’s top priority’ (Nature, 2010, p. 466)
Consensus communication
- Gateway belief model (Wikipedia)
- A systematic review of communication interventions for countering vaccine misinformation finds that communicating the weight-of-evidence and scientific consensus is a promising approach (Whitehead et al. 2023).
Why Trust Science?
Oreskes (2021)
- Naomi Oreskes: Why trust scientists?
- Because of the scientific method? No.
- Because science is about the scrutiny of evidence by a collective of experts (organized skepticism).
- The reliability of modern technology, such as the modern car, results from the accumulation of 100+ years of collective work in science and technology. Our basis for trust in science is the same as our basis for trust in technology.
- But in her book, Oreskes emphasizes that trust in science that enables modern technology doesn’t translate to trust in other fields of science (e.g. climate)
- Three features are crucial to make science trustworthy: consensus, diversity, and methodological openness
Scientific training is intended to eliminate personal bias, but all the available evidence suggests that it does not and probably cannot. Diversity is a means to correct for the inevitability of personal bias. But what is the argument for demographic diversity? Isn’t the point really the need for perspectival diversity? The best answer to this question is that demographic diversity is a proxy for perspectival diversity, or, better, a means to that end. A group of white, middle-aged, heterosexual men may have diverse views on many issues, but they may also have blind spots, for example, with respect to gender or sexuality. Adding women or queer individuals to the group can be a way of introducing perspectives that would otherwise be missed.