The Crisis of Information Integrity

Summary


Severity of the crisis

UN Global Risk Report 2024

The UN Global Risk Report identifies mis- and disinformation as the third most important and second* least prepared global risk (out of 28).

*In Figure 1, mis- and disinformation appears as the second least prepared global risk. However, in the data table on page 15 of the document, it appears as the fourth least prepared.

For mis- and disinformation, the primary obstacles include gaps in data, accountability and communication pathways. These gaps hinder the ability to engage on evidence-based narratives and ensure coordinated responses. […] Based on these insights, the Secretary-General commits to: Immediately create a task team to strengthen the United Nations System’s capacity to address risks in the information ecosystem. The team will focus on the effects of mis- and disinformation on United Nations mandate delivery, including through research, risk assessment and response strategies.


The wealth of studies on this phenomenon has documented its pervasive effects, showing that it is extremely difficult to return the beliefs of people who have been exposed to misinformation to a baseline similar to those of people who were never exposed to it. (Lewandowsky et al. 2012)

If evidence is not valued above opinion, then the potential for science to do good in society is severely curtailed. Media channels (new and old) offer vital fora for knowledge dissemination but without effective regulation to maintain public trust in the quality of information the dice are loaded against evidence and in favour of persuasive lies. (Health 2025)

Disinformation is a plague that now threatens us all; it was identified in a recent report of the WEF as the number-one threat our civilization faces in the years ahead. (Mann and Hotez 2025)


Cognitive biases and motivated reasoning

Crucially – and this nuance often gets lost, even in the scientific literature – there’s an important difference between selectively attending to evidence (or being quicker to accept evidence that fits your worldview) and a deep motivation to actively reject evidence because it doesn’t fit with your personal or political beliefs. […] These transition from the more ‘common’ (milder intuitive heuristics such as selective perception, and confirmation bias) to the ‘rare’ (relatively infrequent but more deliberate biases such as motivated reasoning, and belief polarisation) to full alert mode (full-blown ‘conspiracy-level theorizing’). (Van der Linden 2023)

Conspiratorial thinking

Corrosive skepticism

We might say something like “All scientists are skeptics, but not all skeptics are scientists.” To be a scientist does require a willingness to look at seemingly settled truths and ask curious questions. But that is not all it requires. It also requires a process of testing hypotheses carefully to try to arrive at what is actually true, and abandoning a hypothesis when the evidence in front of you keeps telling you that your hypothesis is wrong (such as, for instance, the hypothesis that vaccines cause autism).

Political drivers

Organized Climate Change Denial
Dunlap and McCright (2011)

A staunch commitment to free markets and disdain of governmental regulations reflect the conservative political ideology that is almost universally shared by the climate change denial community. […] Since anthropogenic climate change is a major unintended consequence of fossil fuel use, simply acknowledging its reality poses a fundamental critique of the industrial capitalist economic system.


NASA Faked the Moon Landing—Therefore, (Climate) Science Is a Hoax: An Anatomy of the Motivated Rejection of Science
Lewandowsky, Oberauer, and Gignac (2013)

Rejection of climate science was strongly associated with endorsement of a laissez-faire view of unregulated free markets (\(r \simeq .80\)). […] perceived scientific consensus is associated with acceptance of science. […]


The Role of Conspiracist Ideation and Worldviews in Predicting Rejection of Science
Lewandowsky, Gignac, and Oberauer (2013)

[…] the driving psychological force that is underlying the rejection of science is “system justification”; that is, a person’s need to perceive the current political and economic system as fair, legitimate, and stable. According to the system justification view, scientific findings are rejected by people high in system justification when the evidence challenges the status quo, rather than on the basis of ideology per se.

Note

And not surprisingly, for if science is about studying the world as it actually is—rather than as we wish it to be—then science will always have the potential to unsettle the status quo.

Merchants of Doubt, Oreskes and Conway (2011).


Asymmetric ideological segregation in exposure to political news on Facebook
González-Bailón et al. (2023)

[…] most misinformation, as identified by Meta’s Third-Party Fact-Checking Program, exists within this homogeneously conservative corner, which has no equivalent on the liberal side.


The freedom to misinform
Health (2025)

Arguments purportedly promoting a freedom of speech agenda seek to characterise fact checking as somehow partisan; such measures have recently been rolled back on Facebook following the political direction of the Trump administration. But actions speak louder than words, and this free speech rhetoric doesn’t match actions to police scientific studies that mention particular words relating to sex and gender, race and disability, or the aggressive cutting of science funding.


A toolkit for understanding and addressing climate scepticism
Hornsey and Lewandowsky (2022)

(1) Most environmentally sceptical books are published or financed by conservative think tanks; (2) funding of climate-denying think tanks and advocacy organizations is approximately US $900 million annually; and (3) the fossil fuel industry was aware of the anticipated consequences of climate change as early as 1965 but continued to deny its existence in public for decades, as revealed by a comparison of ExxonMobil’s internal and external communications.

Disinformation techniques

A history of FLICC: the 5 techniques of science denial (John Cook)


Merchants of Doubt


Constructing “Sound Science” and “Good Epidemiology”: Tobacco, Lawyers, and Public Relations Firms
Ong and Glantz (2001)

Public health professionals need to be aware that the “sound science” movement is not an indigenous effort from within the profession to improve the quality of scientific discourse, but reflects sophisticated public relations campaigns controlled by industry executives and lawyers whose aim is to manipulate the standards of scientific proof to serve the corporate interests of their clients.


There’s No Such Thing As ‘Sound Science’
Christie Aschwanden, FiveThirtyEight

Whereas the “open science” movement aims to make science more reliable, reproducible and robust, proponents of “sound science” have historically worked to amplify uncertainty, create doubt and undermine scientific discoveries that threaten their interests.

Public trust in scientists

While there is no widespread lack of trust in scientists, we cannot discount the concern that lack of trust in scientists by even a small minority may affect considerations of scientific evidence in policymaking. (Cologna et al. 2025)

The majority of the Swiss public supports scientific research and condemns attacks on science, according to the 2025 Science Barometer survey conducted by the University of Zurich. (Schäfer 2025)

References

Cologna, Viktoria, Niels G. Mede, Sebastian Berger, John Besley, Cameron Brick, Marina Joubert, Edward W. Maibach, et al. 2025. “Trust in Scientists and Their Role in Society Across 68 Countries.” Nature Human Behaviour 9 (4): 713–30. https://doi.org/10.1038/s41562-024-02090-5.
Dunlap, Riley E., and Aaron M. McCright. 2011. “Organized Climate Change Denial.” In The Oxford Handbook of Climate Change and Society, edited by John S. Dryzek, Richard B. Norgaard, and David Schlosberg. Oxford University Press. https://doi.org/10.1093/oxfordhb/9780199566600.003.0010.
Ecker, Ullrich K. H., Li Qian Tay, Jon Roozenbeek, Sander van der Linden, John Cook, Naomi Oreskes, and Stephan Lewandowsky. 2024. “Why Misinformation Must Not Be Ignored.” American Psychologist. https://doi.org/10.1037/amp0001448.
Elbeyi, E., K. Bruhn Jensen, M. Aronczyk, J. Asuka, G. Ceylan, J. Cook, G. Erdelyi, et al. 2025. “Information Integrity about Climate Science: A Systematic Review.” Synthesis Report SR2025.1. Zurich, Switzerland: International Panel on the Information Environment (IPIE). https://doi.org/10.61452/BTZP3426.
González-Bailón, Sandra, David Lazer, Pablo Barberá, Meiqing Zhang, Hunt Allcott, Taylor Brown, Adriana Crespo-Tenorio, et al. 2023. “Asymmetric Ideological Segregation in Exposure to Political News on Facebook.” Science 381 (6656): 392–98. https://doi.org/10.1126/science.ade7138.
Health, The Lancet Planetary. 2025. “The Freedom to Misinform.” The Lancet Planetary Health 9 (3): e169. https://doi.org/10.1016/S2542-5196(25)00058-0.
Hornsey, Matthew J., and Stephan Lewandowsky. 2022. “A Toolkit for Understanding and Addressing Climate Scepticism.” Nature Human Behaviour 6 (11): 1454–64. https://doi.org/10.1038/s41562-022-01463-y.
Kulin, Joakim, Ingemar Johansson Sevä, and Riley E. Dunlap. 2021. “Nationalist Ideology, Rightwing Populism, and Public Views about Climate Change in Europe.” Environmental Politics 30 (7): 1111–34. https://doi.org/10.1080/09644016.2021.1898879.
Lewandowsky, Stephan, Ullrich K. H. Ecker, Colleen M. Seifert, Norbert Schwarz, and John Cook. 2012. “Misinformation and Its Correction: Continued Influence and Successful Debiasing.” Psychological Science in the Public Interest 13 (3): 106–31. https://doi.org/10.1177/1529100612451018.
Lewandowsky, Stephan, Gilles E. Gignac, and Klaus Oberauer. 2013. “The Role of Conspiracist Ideation and Worldviews in Predicting Rejection of Science.” PLOS ONE 8 (10): e75637. https://doi.org/10.1371/journal.pone.0075637.
Lewandowsky, Stephan, Klaus Oberauer, and Gilles E. Gignac. 2013. “NASA Faked the Moon Landing—Therefore, (Climate) Science Is a Hoax: An Anatomy of the Motivated Rejection of Science.” Psychological Science 24 (5): 622–33. https://doi.org/10.1177/0956797612457686.
Mann, M., and P. Hotez. 2025. Science Under Siege: How to Fight the Five Most Powerful Forces That Threaten Our World. Scribe Publications Pty Limited. https://books.google.ch/books?id=FUVGEQAAQBAJ.
Ong, Elisa K., and Stanton A. Glantz. 2001. “Constructing ‘Sound Science’ and ‘Good Epidemiology’: Tobacco, Lawyers, and Public Relations Firms.” American Journal of Public Health 91 (11): 1749–57. https://doi.org/10.2105/ajph.91.11.1749.
Oreskes, Naomi, and Erik M. Conway. 2011. Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. Bloomsbury Publishing USA.
Oreskes, Naomi, and Erik M. Conway. 2010. “Defeating the Merchants of Doubt.” Nature 465 (7299): 686–87. https://doi.org/10.1038/465686a.
Schäfer, Mike S. 2025. “Wissenschaftsbarometer.” University of Zurich.
Supran, G., S. Rahmstorf, and N. Oreskes. 2023. “Assessing ExxonMobil’s Global Warming Projections.” Science 379 (6628): eabk0063. https://doi.org/10.1126/science.abk0063.
United Nations. 2025. “Global Risk Report.” United Nations. https://unglobalriskreport.org/.
Van der Linden, Sander. 2023. Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity. WW Norton & Company.