Prebunking: Staying ahead of the curve on misinformation

The spread of misinformation poses a significant challenge to societies worldwide. Beyond debunking it after the fact, a new line of research looks at how we can prevent social media users from falling for misinformation in the first place: prebunking.

The spread of misinformation poses a significant challenge to societies worldwide. For example, malicious rumours have been known to spark deadly riots, and belief in misinformation has been associated with a reduced likelihood of getting vaccinated (for example against COVID-19). Researchers have therefore looked for effective ways to counter people’s belief in and sharing of misinformation. In addition to debunking misinformation after it has spread, a new line of research looks at how we can prevent social media users from falling for it in the first place: an approach known as prebunking.

Interventions against misinformation

In a recent paper, we reviewed the evidence behind interventions aimed at countering misinformation. We divide them into two broad categories: individual-level and system-level interventions.

Figure 1. Different types of misinformation interventions. Taken from Roozenbeek, Culloty and Suiter (2022).

These intervention categories operate alongside each other in a complex news environment, and each has its upsides and downsides. System-level interventions are generally the most impactful; for instance, Germany’s NetzDG (or Network Enforcement Act) obligates social media platforms to remove “clearly illegal” online content. However, while system-level interventions may be effective at curbing misinformation and hate speech, they also carry significant risks. The German law has been criticised for making social media companies the arbiters of what kinds of content are allowed and what should be removed. Balancing efficacy with democratic ideals remains an important challenge for system-level interventions.

Individual-level interventions do not have this problem. For instance, informing someone that a news story is false doesn’t infringe on their right to express themselves. There is also substantial demand among educators, activists, regulators and the general public for materials that help people navigate online spaces. On the other hand, such interventions are generally less impactful and yield relatively small effect sizes (see, for example, the OECD’s recent replication of an “accuracy nudge” intervention). Perhaps more importantly, a focus on individual-level interventions may draw attention away from system-level ones (such as a critical look at social media platforms’ recommendation algorithms). While individual-level interventions are unlikely to do harm, it’s not always clear how well they work in the real world.

Read more on the Forum Network: The Digital Bazaar: How schools can help students handle understanding and ambiguity in the information age by Andreas Schleicher, Director, Education and Skills, OECD
Striking a balance between competing demands—equity and freedom, autonomy and community, innovation and continuity, efficiency and democratic process—hinges on 21st-century literacy skills. The good news is that the tools to develop these skills are ready.

Prebunking: reducing susceptibility to misinformation

As psychologists, we nonetheless focus on developing individual-level interventions. A common approach to tackling misinformation at the individual level is debunking (or fact-checking). While we think this approach is worthwhile—and research generally shows that it’s effective at correcting misperceptions—there are also some limitations. For example, correcting a misperception doesn’t completely undo people’s memory of it, a phenomenon known as the “continued influence effect”. Another problem is that fact-checks often fail to reach the people who were exposed to the original misinformation.

It’s therefore useful to also explore how people can be prevented from falling for misinformation in the first place. The umbrella term for such approaches is prebunking, or pre-emptive debunking. While there are many different methods of prebunking, our preferred approach is grounded in inoculation theory. Originally developed in the 1960s, inoculation theory posits that people can develop psychological resistance to future manipulation attempts by being pre-emptively exposed to a “weakened dose” of misinformation.

Although inoculation theory has been shown to generally be effective, one limitation has been its scalability. Inoculating people against every individual example of misinformation is not possible, because you can’t always know ahead of time what specific content people will be exposed to. To mitigate this problem, researchers instead decided to inoculate people against the techniques and tropes that underlie the misinformation we see online (such as emotional manipulation or conspiratorial reasoning). The advantage of this approach is that you can design interventions that apply to a much broader range of misinformation.
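To make the technique-based idea concrete, here is a minimal illustrative sketch in Python of how text might be flagged for two of the techniques mentioned above. It is purely hypothetical: the cue phrases, names and matching logic are toy assumptions for illustration only, not the method behind the interventions described in this article (which are games and videos, not classifiers).

    # Toy rule-based tagger for two manipulation techniques.
    # The cue phrases below are invented for illustration and are
    # far too crude for any real-world use.
    TECHNIQUE_CUES = {
        "emotional manipulation": ["terrifying", "outrage", "will destroy"],
        "conspiratorial reasoning": ["cover-up", "what they don't want you to know"],
    }

    def tag_techniques(text: str) -> list[str]:
        """Return the toy techniques whose cue phrases appear in the text."""
        lowered = text.lower()
        return [
            technique
            for technique, cues in TECHNIQUE_CUES.items()
            if any(cue in lowered for cue in cues)
        ]

    headline = "TERRIFYING cover-up: what they don't want you to know!"
    print(tag_techniques(headline))
    # -> ['emotional manipulation', 'conspiratorial reasoning']

The point of the sketch is simply that technique-level cues generalise across topics: the same two rules would fire on a health rumour or a political smear alike, which is what makes technique-based inoculation applicable to a broad range of misinformation.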

We developed a series of such technique-based inoculation interventions. These include games such as Bad News (where you play as the owner of a “fake news” website), Harmony Square (where you mount a disinformation campaign to drive apart a peaceful community), and Go Viral! (where you take a dive into a COVID-19 echo chamber). Researcher John Cook developed Cranky Uncle, a game where you learn about the various tactics used in climate misinformation.

Figure 2. Screenshots from the Bad News, Harmony Square, Go Viral! and Cranky Uncle games.

However, although these games are generally quite effective, improving people’s ability to recognise misinformation for several weeks after playing, their scalability is also limited: after all, not everyone is interested in playing a game. To address this problem, we partnered with Google Jigsaw to develop a series of short inoculation videos, each of which tackles a manipulation technique commonly encountered online: emotionally manipulative language; false dichotomies; incoherence; ad hominem attacks; and scapegoating.

In a recent study published in Science Advances, we tested how well these videos improve resilience to misinformation. We first conducted six large-sample lab studies, and found that the videos were effective at improving people’s ability to recognise manipulative content. However, efficacy in the lab doesn’t always translate to real-world effectiveness. We therefore ran two of the videos as an ad campaign on YouTube, and used YouTube’s Brand Lift feature to ask people who had seen a video to identify manipulation techniques in news headlines. We found that people who had watched an inoculation video were 5-10% better at this than a control group. This is a highly encouraging finding, especially because YouTube users could skip the ad, turn off the volume, or switch tabs.
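For intuition on what a 5-10% difference between treatment and control can look like statistically, here is a back-of-the-envelope sketch using entirely made-up sample sizes and response rates (the study’s actual design, numbers and analysis are in the paper). It runs a standard two-proportion z-test using only the Python standard library.

    import math

    def two_proportion_ztest(x1: int, n1: int, x2: int, n2: int):
        """Two-sided z-test for a difference between two proportions."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
        return p1 - p2, z, p_value

    # Hypothetical: 5,000 viewers per arm; the inoculated group spots the
    # manipulation technique 60% of the time versus 54% for the control.
    diff, z, p = two_proportion_ztest(3000, 5000, 2700, 5000)
    print(f"difference = {diff:.1%}, z = {z:.2f}, p = {p:.3g}")
    # prints roughly: difference = 6.0%, z = 6.06, p = 1.36e-09

With samples of this size, even a six-percentage-point gap is statistically unambiguous; the open question, as noted below, is whether such gains in recognition carry over into behaviour.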

Despite these encouraging results, some limitations remain: the limited efficacy of individual-level interventions aside, we haven’t yet looked at whether prebunking misinformation has an influence on people’s behaviour (for example the sharing of misinformation with others). Nonetheless, tackling misinformation pre-emptively at scale appears to be feasible, including in real-world social media environments.

Visit our website to watch the videos, play the games and learn more about the research!

To learn more, listen also to the OECD podcast Is digital media literacy the answer to our disinformation woes?

And register for the OECD Global Forum on Building Trust and Reinforcing Democracy, which will take place on 17-18 November 2022.
