Archiving notice

Since their introduction in early 2020, COVID-19 vaccines have saved at least 1.4 million lives in Europe alone. Worldwide, more than 12 billion doses of the vaccines have been administered as of March 2024, and between 14 and 20 million lives have been saved. In light of this success and the generally high vaccination uptake in most countries around the world, we have decided to discontinue updating this wiki. The existing material may be of interest for archival reasons and will remain accessible, but no further updates will be undertaken. If you have any questions about the wiki or the Handbook, please contact the authors.

Return to COVID-19 Vaccine Communication Handbook & Wiki entry page

Note: text in green identifies latest update. (See last updated time stamp at top of page.)

Nudging

Flattening the curve of the "infodemic"

Although it is possible to correct misinformation, as we have shown in the COVID-19 Vaccine Communication Handbook, the Debunking Handbook 2020, and the Conspiracy Theory Handbook, this may not be sufficient to stop disinformation from gathering speed, particularly if it is disseminated by highly motivated people for political reasons. (Our page on the politics of COVID-19 disinformation explains this in more detail.)

We must therefore look for additional ways in which we can “flatten the curve of the infodemic, so that bad information can’t spread as far and as fast” (Ball and Maxmen, 2020). Two of those options are nudges and boosts.

What are nudges?

Nudges use the context in which a decision takes place to systematically influence that decision. For example, employees at Google have access to free food and drink at work. Healthy options, such as sparkling water, are more visible and easier to access than less healthy options, such as sugary drinks. Together with a number of other nudges, Google successfully shifted employees' consumption towards healthier options.

Many nudges have been successfully explored in a variety of settings.

What are boosts?

Boosts resemble nudges in that they have the same goal of systematically affecting decisions. However, unlike nudges, boosts seek to "improve" decisions by reminding people of existing knowledge or beliefs, and encouraging them to use that knowledge in a specific context. For example, a "boost" may be as simple as telling people that they should immediately call an ambulance when they suspect someone is having a heart attack: this is not obvious to many people (Grüne-Yanoff & Hertwig, 2016)!

Are boosts and nudges manipulative?

Both approaches respect behavioural autonomy. They neither incentivize nor limit choice options: people are still free to choose. Specifically, with respect to fighting misinformation, the choice environment can be designed in a way that facilitates proper identification of and interaction with information (without limiting access to information or the right to free speech and media). Such nudges can increase resilience towards misinformation and empower consumers to spot and counter misinformation. One example is Facebook's recent introduction of semi-opaque masks that are overlaid onto material identified as being false: these masks inhibit but do not preclude access to questionable material. According to Facebook, 95% of people do not view the questionable content if they are warned. This is a significant reduction of misinformation compared to when no warning is presented.

How can nudges and boosts be applied to misinformation?

Recent research provides evidence that people may fail to even consider whether news content is accurate before they share it. In consequence, they may share attention-grabbing but false information (e.g., Pennycook et al., 2020a).

One approach, therefore, is to subtly prompt or nudge people to consider accuracy before sharing content on social media, thereby increasing the salience of truth to change the way that people interact with social media. This approach has been shown to increase the quality of news content that people intend to share on social media, both in the context of COVID-19 misinformation (Pennycook et al., 2020b) and political misinformation (Pennycook et al., 2020a).

A similar approach asks people to commit to a Pro-Truth Pledge of verifying information before sharing and correcting misinformation. Signatories pledge their "Earnest Efforts To: Share truth, Honor truth, and Encourage truth". It has been found that people who took the pledge posted and shared more accurate news stories on Facebook (Tsipursky et al., 2018).

Would you like to find out more about nudging? We created a search query specifically for this page, which links you to other interesting resources such as Twitter threads, blog posts, websites, videos and more. Check out the search query here.

Would you like to know more about how we generated the search queries and how our underlying knowledge base works? Click here to learn more.


Page contributors: Stephan Lewandowsky, Dawn Holford

Return to COVID-19 Vaccine Communication Handbook & Wiki entry page


This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 964728 (JITSUVAX).

Have feedback? Want to contribute? Email us or leave us a comment.
