How We Can Protect Ourselves Against Health Misinformation
By Tyler Morris • December 23, 2025
1 – Introduction: From Diagnosing the Problem to Building Cognitive Defenses
Public understanding of nutrition and health is shaped as much (if not more) by narratives and claims on social media as it is by scientific evidence. In Part 1 of this series, I outlined how fear-driven claims, political rhetoric, and misrepresentation of evidence drive the spread of inaccurate information. Much of the discourse surrounding nutrition, food, and health is sensationalized and fixates on minor or exaggerated risks while paying less attention to the well-established drivers of chronic disease. The “Make America Healthy Again” (MAHA) movement exemplifies this pattern, often framing health problems around simplistic scapegoats rather than broader dietary, behavioral, and systemic factors.
Here in Part 2, I shift toward cognitive defense against misinformation. Rather than proposing policy reforms (which I will save for Part 3), this article examines evidence-based cognitive and educational strategies that can help protect individuals from misinformation. Drawing on behavioral, psychological, and communication research, I discuss approaches such as prebunking and debunking, along with the importance of media, health, digital, and scientific literacy. These approaches are not a comprehensive solution by themselves, nor do they eliminate the need for systemic change in our healthcare system, food environment, information environment, and social media platforms. Instead, they represent foundational tools and skills for navigating an increasingly complex (mis)information environment. In Part 3, I will build on this discussion by turning explicitly to the structural drivers of poor health and the evidence-based policies and solutions needed to address them.
2 – Cognitive Defense: Winning the (Mis)Information War
Before discussing some cognitive defenses against health misinformation, let’s first define misinformation. While there are several slightly different definitions – misinformation is typically used as an umbrella term for many concepts and applied differently depending on the context – I will use a common, simple definition adapted from social scientist Matthew Facciani (2025): misinformation is any misleading, inaccurate, or outright false information, and it can be spread unintentionally or intentionally (p. 9). The latter is commonly referred to as “disinformation”; however, because intent is extremely difficult to determine, I will use misinformation as the broader term throughout this discussion.
I also want to share a more comprehensive definition of misinformation that I discovered in a review article by Philipp Schmid (2025), originally from Southwell et al. (2022), who define scientific misinformation as information “that is misleading or deceptive relative to the best available scientific evidence or expertise at the time and that counters statements by actors or institutions who adhere to scientific principles without adding accurate evidence for consideration.” This more comprehensive definition adds value by emphasizing that misinformation is defined not only by factual inaccuracy, but by how claims relate to the best available scientific evidence at a given time. This distinction is especially important in nutrition and health, where uncertainty and evolving evidence are common. Facciani calls this “midinformation” – information that is not clearly false or true, just evolving and incomplete (2025, p. 16).
Addressing misinformation, on any topic, requires an understanding of the cognitive mechanisms that allow falsehoods to spread, and the psychological interventions that can minimize their spread and harm. To improve accurate public understanding of health and nutrition information, we must leverage insights from behavioral, psychological, and communication research.
2.1 – Prebunking: Inoculation Against Misinformation
Just as vaccines prime our immune system to resist viruses, psychological “inoculation” can prime our minds to resist misinformation (Lewandowsky et al., 2020). This approach, more broadly known as “prebunking,” exposes people to a weakened version or “microdose” of inaccurate information, or to the manipulative strategies commonly used to spread misinformation (Roozenbeek and Van der Linden, 2022). The former is known as issue-based inoculation, which targets a specific misleading claim or topic; the latter is known as technique-based inoculation, which builds broader, more flexible “immunity,” since the same techniques (e.g., logical fallacies, conspiratorial thinking, and emotionally charged rhetoric) appear across many misinformation topics (Roozenbeek and Van der Linden, 2021).
The existing literature offers several explanations for why prebunking works, all rooted in fundamental features of human cognition. One such account, from Guangchao Charles Feng (2025), holds that psychological inoculation works because it mirrors the sequence of the biological immune response: an initial alert, a safe rehearsal, and the retention of defenses. A forewarning (mentioning that the information or claim about to be shared is inaccurate) heightens vigilance, much like the innate immune system, while exposure to a weakened misinformation tactic paired with its refutation allows people to learn about the tactic and the false information in a low-stakes environment. This cultivates “cognitive antibodies” that lower susceptibility to misinformation and the risk of spreading it, much as vaccines lower the risk of contracting and transmitting a virus (Lewandowsky et al., 2020).
Rooted in the seminal Inoculation Theory developed by social psychologist William J. McGuire (1964), modern misinformation prebunking research has transitioned from theoretical social psychology into scalable digital interventions. Prominent researchers have demonstrated that exposing people to “microdoses” of misleading tactics can effectively build psychological resistance to manipulation (i.e., strengthening the “psychological immune system”) (Roozenbeek et al., 2020). An online game called Bad News places players in a fictional social media environment where they play the role of a misinformation producer, deploying common misinformation techniques, such as trolling and using fake accounts. Across large cross-national samples, gameplay reliably reduced the perceived credibility of misinformation and improved players’ ability to spot misinformation, while maintaining trust in legitimate content, indicating improved discernment rather than generalized skepticism (Roozenbeek and Van der Linden, 2021). Notably, these effects emerged consistently across countries and demographic groups, suggesting that interactive, technique-based inoculation can function as a broad-spectrum and culturally robust tool for strengthening public resilience to misinformation.
Another example is Go Viral!, a COVID-19 misinformation inoculation game also developed by Roozenbeek and Van der Linden. It shows strong, topic-specific effectiveness in helping people recognize the manipulation techniques used to spread COVID-19 misinformation (Basol et al., 2021). Across English, French, and German samples, a brief five-minute gameplay session made participants more likely to view misleading COVID-19 content as manipulative, increased their confidence in identifying misinformation, and reduced their willingness to share it. The gains in perceived manipulativeness and confidence persisted for at least a week after exposure (Basol et al., 2021). In the same study, the researchers found that an intervention consisting of infographics about COVID-19 misinformation also improved users’ ability and confidence in spotting misinformation, although the effects were much smaller than those of the game (Basol et al., 2021). This pattern is consistent with a recent meta-analysis of 33 inoculation experiments, which found that interactive prebunking formats – particularly game- and video-based interventions – produce strong improvements in distinguishing accurate from inaccurate news without inducing generalized skepticism (Simchon et al., 2025). Altogether, these findings suggest that inoculation can be scaled effectively across different demographics and domains while avoiding the unintended consequence of eroding trust in legitimate information.
2.2 – Debunking: Catching and Correcting Misinformation After It Has Spread
Unfortunately, we can’t prebunk everything. Misinformation is “sticky,” so myths will spread, and debunking them then becomes necessary (Lewandowsky et al., 2020). Simply put, debunking is correcting false information after it has been shared (Schmid, 2025). However, corrections sometimes fail to completely eliminate the influence of misinformation on reasoning, a phenomenon known as the “continued influence effect” (Schmid, 2025). This persistence occurs because the misinformation may remain more familiar or easier to retrieve than the correction, or because the correction leaves a gap in the individual’s mental model, making the correction itself harder to retrieve from memory (Ecker et al., 2022; American Psychological Association, 2023). To mitigate this, effective debunking should go beyond simply saying “that claim is false” and instead provide a detailed alternative explanation that fills the gap left by the corrected information (Van der Linden, 2022).
To maximize the impact of corrections, experts often recommend the “fact sandwich” structure (Fact-Myth-Fallacy-Fact), which sandwiches the misinformation between an opening and closing statement of the fact (Lewandowsky et al., 2020; Schmid, 2025). This structure is designed to make the truth more salient while avoiding the “illusory truth effect,” whereby repeated exposure and familiarity make a myth seem more true, even to people who initially knew it was false (Lewandowsky et al., 2020). While earlier research raised concerns that corrections might inadvertently strengthen misconceptions (the “backfire effect”), recent consensus indicates that this is less likely than once thought, suggesting that practitioners should not avoid debunking out of fear of making misinformation worse (Lewandowsky et al., 2020; Simchon et al., 2025; American Psychological Association, 2023; Van der Linden, 2022; Morgan et al., 2025). Furthermore, recent literature suggests that while detailed refutations are important, the specific ordering of the fact-sandwich components may not be essential to their effectiveness (Swire-Thompson et al., 2025).
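To make the structure concrete, here is a hypothetical fact sandwich applied to a generic nutrition myth (the specific wording is my own illustration, not drawn from the sources above). Fact: overall dietary pattern, not any single ingredient, is the strongest dietary driver of chronic disease risk. Myth: you may have heard the claim that one “toxic” ingredient is the hidden cause of chronic illness. Fallacy: this claim relies on oversimplification, attributing a complex, multicausal health outcome to a single scapegoat. Fact: the evidence consistently points back to overall diet quality and the well-established behavioral drivers of chronic disease. Notice that the correction opens and closes with the accurate information, names the manipulation technique explicitly, and supplies an alternative explanation rather than merely labeling the myth false.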
When comparing interventions, the consensus is that “prevention is better than cure,” with prebunking (psychological inoculation) often yielding larger effect sizes than debunking (O’Mahony et al., 2023; Van der Linden, 2022). While prebunking is highly effective for building resistance, debunking remains a necessary tool for addressing false beliefs that have already taken root, particularly when delivered by trusted sources (Schmid, 2025). Altogether, in most circumstances it is likely a good idea to debunk false information, especially if it has reached a large audience and the correction can be shared by trusted sources. There are certainly cases where it is best to stay silent to avoid drawing attention to the misinformation (e.g., social media trolls and bots with small reach), but more evidence-based voices sharing accurate information will help us win the (mis)information war.
2.3 – Multi-Modal Literacies as Defense Against Misinformation
The distinctions between health, media, and digital literacies are blurred, yet each plays a vital role in equipping individuals to navigate the modern information ecosystem (American Psychological Association, 2023). Broadly speaking, information literacy is defined as the set of skills necessary to navigate complex information environments, with other, more specific literacies falling underneath it (Syracuse University, 2025). Health literacy is the capacity to access, process, and understand the basic health information required to make appropriate decisions (Institute of Medicine, 2004). Media literacy is the ability to evaluate media messages across print and online formats, while digital literacy refers specifically to the skills required to execute tasks online (Syracuse University, 2025). These literacies are complemented by social media literacy (SML), the ability to critically assess content consumed or produced on digital platforms, which helps protect against the rapid dissemination of false information (Ziapour et al., 2024). Finally, scientific literacy encompasses a grasp of scientific terminology and concepts, an understanding of scientific inquiry, and an awareness of the interactions between science and other domains (Mellberg et al., 2025). Importantly, it extends beyond content knowledge to include an understanding of the nature of science (NOS) – the epistemological foundations of how scientific knowledge is produced, validated, and self-corrected through peer review and expert consensus. Scientific literacy aims to produce a competent “outsider” who can discern credible sources and understand the structure of reliable information without needing to be an expert in specific fields (Mellberg et al., 2025).
The existing literature suggests several strategies for improving these literacies, with solid evidence for how each can help prevent the spread of misinformation. While I will not go into detail on all of them, I would like to highlight one vital digital literacy skill: lateral reading. An approach used by professional fact-checkers, lateral reading is the act of leaving an unfamiliar website to search elsewhere for information about the source’s credibility (Guess et al., 2020). In one study, lateral reading training led to significant improvements in participants’ ability to identify fake news (Fendt et al., 2023). These findings suggest that even brief instruction in lateral reading can meaningfully strengthen people’s ability to evaluate online information, making it a practical and scalable digital literacy skill for countering misinformation.
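As a concrete, hypothetical illustration: upon landing on an unfamiliar website claiming that a common food additive causes disease, a lateral reader would not evaluate the article on its own terms. Instead, they would open a new tab and search for the site’s name and authors, check what fact-checkers, reputable news outlets, and scientific organizations say about the source and the claim, and only then decide how much weight the original article deserves.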
Ultimately, these literacies and skills may function as meaningful determinants of health-related behavior, including belief in nutrition-related misinformation. Low health literacy has been linked to poor adherence to preventive measures, such as vaccination (Saleem and Jan, 2024). Conversely, higher SML empowers individuals to critically analyze media messages and make better health decisions, making them less likely to fall for health-related misinformation (Ziapour et al., 2024). Literacy interventions are best viewed as “boosters” that build long-term psychological resilience (American Psychological Association, 2023). And to be effective at scale, these competencies must be integrated into national policies and school curricula to build a health-literate society capable of managing the risks of an infodemic (American Psychological Association, 2023; Saleem and Jan, 2024). There is already a substantial body of work demonstrating how media literacy can be effectively embedded into school curricula. For example, Chris Sperry’s work with Project Look Sharp illustrates how critical media analysis can be integrated across subjects through a methodology called Constructivist Media Decoding (CMD), providing a practical model for cultivating these skills early on (Project Look Sharp, n.d.). I will discuss this in more detail in Part 3.
3 – Conclusion and Looking Forward to Part Three
A growing body of empirical research demonstrates that misinformation is not simply a product of ignorance, but a predictable outcome of how humans process information in an environment characterized by cognitive limitations, emotional cues, and information overload. As reviewed in this article, strategies such as prebunking, debunking, and the development of media, health, and scientific literacies can meaningfully reduce susceptibility to false claims and limit their spread. These approaches work by strengthening our cognitive defenses – enhancing our ability to recognize manipulation and misinformation tactics and to evaluate claims – rather than by attempting to correct every false claim or requiring everyone to become a subject-matter expert.
Importantly, these strategies do not rely on eliminating our many human biases entirely. We are subject to dozens of cognitive biases and heuristics (mental shortcuts) that greatly influence information processing, belief formation, and information sharing, which I did not discuss here. Approaches such as prebunking and improving the various literacies aim to reduce misinformation spread by strengthening our awareness, skepticism, and analytical skills. Readers interested in cognitive biases and logical fallacies, or in more comprehensive treatments of misinformation, are encouraged to consult the additional resources below.
Nutrition misinformation provides a particularly illustrative application of these cognitive defenses. Prebunking can help individuals recognize familiar messaging patterns in nutrition discourse, such as fearmongering about ingredients or the tendency to attribute complex health outcomes to single dietary components. Effective debunking, when paired with clear alternative explanations, can redirect attention away from exaggerated claims and myths. Additionally, media, health, and scientific literacy skills are essential for evaluating nutrition and health claims that often rely on selective or misleading interpretations of research findings.
While cognitive defenses are essential, they are not sufficient on their own. Building a health-literate public must occur alongside changes to the environments in which people learn, eat, and make health decisions. In Part 3 of this series, I will turn to the systemic factors that shape population health and examine evidence-based policies and interventions – including education, food systems, and public health infrastructure – that are necessary to improve population-level health.
Literature Cited
Simchon, A., Zipori, T., Teitelbaum, L., Lewandowsky, S., & van der Linden, S. (2025). A signal detection theory meta-analysis of psychological inoculation against misinformation. Current Opinion in Psychology, 67, 102194. https://doi.org/10.1016/j.copsyc.2025.102194
American Psychological Association. (2023). Using psychological science to understand and fight health misinformation. https://www.apa.org/pubs/reports/health-misinformation
Guess, A. M., Lerner, M., Lyons, B., Montgomery, J. M., Nyhan, B., Reifler, J., & Sircar, N. (2020). A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proceedings of the National Academy of Sciences, 117(27), 15536–15545. https://doi.org/10.1073/pnas.1920498117
Basol, M., Roozenbeek, J., Berriche, M., Uenal, F., McClanahan, W. P., & van der Linden, S. (2021). Towards psychological herd immunity: Cross-cultural evidence for two prebunking interventions against COVID-19 misinformation. Big Data & Society, 8(1). https://doi.org/10.1177/20539517211013868
Facciani, M. (2025). Misguided: Where Misinformation Starts, How It Spreads, and What to Do About It. United States: Columbia University Press.
Fendt, M., Nistor, N., Scheibenzuber, C., & Artmann, B. (2023). Sourcing against misinformation: Effects of a scalable lateral reading training based on cognitive apprenticeship. Computers in Human Behavior, 146, 107820. https://doi.org/10.1016/j.chb.2023.107820
Institute of Medicine. 2004. Health Literacy: A Prescription to End Confusion. Washington, DC: The National Academies Press.
Morgan, J. C., Kornides, M. L., Lee, J., & Fishman, J. (2025). Different vaccination debunking interventions: A randomized, controlled experiment estimating “backfiring” and positive effects. Vaccine, 62, 127463. https://doi.org/10.1016/j.vaccine.2025.127463
Mellberg, S., Danielsson Friberg, A., & Nygren, T. (2025). Science education against misinformation: an educational intervention in upper-secondary schools. International Journal of Science Education, 1–33. https://doi.org/10.1080/09500693.2025.2571131
O'Mahony, C., Brassil, M., Murphy, G., & Linehan, C. (2023). The efficacy of interventions in reducing belief in conspiracy theories: A systematic review. PloS one, 18(4), e0280902. https://doi.org/10.1371/journal.pone.0280902
Schmid, P. (2025). Debunking health misinformation with empathy. Current Opinion in Psychology, 67, 102213. https://doi.org/10.1016/j.copsyc.2025.102213
Project Look Sharp. (n.d.). Retrieved December 22, 2025, from https://www.projectlooksharp.org/
Roozenbeek, J., & Van der Linden, S. (2021). Inoculation Theory and Misinformation. Riga: NATO Strategic Communications Centre of Excellence.
Roozenbeek, J., van der Linden, S., & Nygren, T. (2020). Prebunking interventions based on “inoculation” theory can reduce susceptibility to misinformation across cultures. Harvard Kennedy School Misinformation Review, 1(2). https://doi.org/10.37016//mr-2020-008
Roozenbeek, J., & van der Linden, S. (2022). How to combat health misinformation: A psychological approach. American Journal of Health Promotion, 36(3), 569–575. https://doi.org/10.1177/08901171211070958
Saleem, S. M., & Jan, S. S. (2024). Navigating the infodemic: Strategies and policies for promoting health literacy and effective communication. Frontiers in Public Health, 11, 1324330. https://doi.org/10.3389/fpubh.2023.1324330
Southwell, B. G., Brennen, J. S. B., Paquin, R., Boudewyns, V., & Zeng, J. (2022). Defining and measuring scientific misinformation. The ANNALS of the American Academy of Political and Social Science, 700(1), 98–111. https://doi.org/10.1177/00027162221084709
Lewandowsky, S., Cook, J., Ecker, U. K. H., Albarracín, D., Amazeen, M. A., et al. (2020). The Debunking Handbook 2020. https://doi.org/10.17910/b7.1182
Swire-Thompson, B., Butler, L., & Rapp, D. N. (2025). The truth sandwich format does not enhance the correction of misinformation.
Syracuse University School of Information Studies. (2025, September 22). What is information literacy and why does it matter? https://ischool.syracuse.edu/what-is-information-literacy/
McGuire, W. J. (1964). Inducing resistance to persuasion: Some contemporary approaches. In L. Berkowitz (Ed.), Advances in Experimental Social Psychology (Vol. 1, pp. 191–229). Academic Press. https://doi.org/10.1016/S0065-2601(08)60052-0
Ziapour, A., Malekzadeh, R., Darabi, F., Yıldırım, M., Montazeri, N., Kianipour, N., & Nejhaddadgar, N. (2024). The role of social media literacy in infodemic management: A systematic review. Frontiers in Digital Health, 6, 1277499. https://doi.org/10.3389/fdgth.2024.1277499
Recommended Further Reading
American Psychological Association. (2023). Using psychological science to understand and fight health misinformation. https://www.apa.org/pubs/reports/health-misinformation
Facciani, M. (2025). Misguided: Where Misinformation Starts, How It Spreads, and What to Do About It. United States: Columbia University Press. (Book)
Soprano, M., Roitero, K., La Barbera, D., Ceolin, D., Spina, D., Demartini, G., & Mizzaro, S. (2024). Cognitive biases in fact-checking and their countermeasures: A review. Information Processing & Management, 61(3), 103672. https://www.sciencedirect.com/science/article/pii/S0306457324000323
van der Linden, S. (2023). Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity. United States: W. W. Norton. (Book)
Thinking Is Power. Science and its pretenders: Pseudoscience and science denial. https://thinkingispower.com/science-and-its-pretenders-pseudoscience-and-science-denial/

Tyler Morris is a Nutrition Science student on the pre-med track at Indiana University–Bloomington, where he is also minoring in Chemistry and serving as president of the School of Public Health–Bloomington Honors Program. His academic and professional interests include how health misinformation spreads, the psychology behind why people believe it, and strategies to strengthen critical thinking and media literacy. He plans to pursue a career in medicine and research while remaining actively engaged in science advocacy and communication.