Info Interventions

A set of approaches, informed by behavioural science research and validated by digital experiments, to build resilience to online harms.
Accuracy prompts
Hypothesis: Refocus user attention towards accuracy.
How it works
1. The individual scrolls through their social feed and comes across content with potential misinformation.
2. An accuracy prompt is triggered and pops up over the content.
3. A bite-sized explanation of why they are seeing the reminder is served to the individual, shifting their attention to the accuracy of the content with information literacy tips.
4. The individual is now prompted to be more aware and may think twice when coming across similar content in their feed.
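The flow above can be sketched in a few lines. This is a minimal illustration, not the production system: the classifier, field names, and prompt copy are all hypothetical stand-ins.

```python
# Hypothetical sketch of an accuracy-prompt flow.
# Real deployments use platform-specific triggers and UI.

def is_potential_misinformation(post):
    # Stand-in for a real classifier: here, an upstream moderation flag.
    return post.get("flagged", False)

def accuracy_prompt(post):
    """Return a prompt message for flagged posts, else None."""
    if is_potential_misinformation(post):
        return (
            "Before you share: is this headline accurate? "
            "Check the source and look for corroborating reports."
        )
    return None

# Example feed (illustrative data only).
feed = [
    {"id": 1, "headline": "Local team wins final", "flagged": False},
    {"id": 2, "headline": "Miracle cure found", "flagged": True},
]
prompts = [accuracy_prompt(p) for p in feed]
```

Only the flagged post receives a prompt; unflagged content scrolls by untouched, which keeps the intervention rare enough to stay salient.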
Findings

Online survey experiments were conducted across 16 countries with more than 30,000 participants, who were asked to rate their intention to share true and false news headlines.

Examples
Resources
Redirect Method
Hypothesis: Interrupt online radicalisation.
How it works
1. The individual completes an online search using keywords that indicate an interest in extremist propaganda.
2. The Redirect Method picks up on the keywords and triggers an intervention.
3. An ad featuring more information on their topic of interest is presented to the individual.
4. Upon clicking the ad, the individual is redirected to content that counters false extremist narratives.
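The keyword-matching step can be sketched as follows. The keyword list and destination URL are placeholders; real deployments rely on curated, expert-maintained keyword sets and ad-platform targeting rather than simple substring checks.

```python
# Hypothetical sketch of keyword-triggered redirection.
# REDIRECT_KEYWORDS is an illustrative placeholder, not a real curated list.

REDIRECT_KEYWORDS = {"extremist propaganda", "join the movement"}

def should_redirect(query):
    """True if the search query matches a redirect keyword."""
    q = query.lower()
    return any(kw in q for kw in REDIRECT_KEYWORDS)

def serve_ad(query):
    """Return the counter-narrative destination for matching queries, else None."""
    if should_redirect(query):
        return "https://example.org/counter-narratives"  # placeholder URL
    return None
```

In practice the match triggers an ad auction rather than a direct redirect, and the destination is a curated playlist of counter-narrative content.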
Findings

Examples

The content was uploaded by users from around the world to confront online radicalisation, and selected by expert practitioners.

Our methodology focuses on the slice of ISIS's audience most susceptible to its messaging and recognises that even content not created for the purpose of counter-messaging can still undermine harmful narratives when curated, organised and targeted effectively. Since 2016, Moonshot has partnered with an array of technology companies, including Facebook, to deploy advertising to those expressing an interest in other online harms, including white supremacy, violent misogyny and conspiracy theories.
Prebunking
Hypothesis: Increase resistance to manipulation.
How it works
1. A prebunking video is served to a group of users as an ad in their social media feed.
2. Through short video messages, the individual is informed of possible attempts to manipulate them online.
3. The individual is shown a relevant example of a manipulative technique or narrative, then given counter-arguments to refute the claim.
4. By analysing how well viewers recall the techniques in a short survey, relative to a control group, we can assess their likelihood of resisting manipulative content in the future.
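Step 4 boils down to a treatment-versus-control comparison of technique-recognition rates. A minimal sketch, with illustrative survey data rather than the study's actual numbers:

```python
# Hypothetical sketch of the step-4 assessment: compare the share of
# correct technique-recognition answers between video viewers and a
# control group. The answer lists below are illustrative only.

def recognition_rate(answers):
    """Fraction of survey answers that correctly identified the technique."""
    return sum(answers) / len(answers)

treatment = [True, True, False, True, True]    # watched the prebunking video
control = [True, False, False, True, False]    # did not watch it

# Positive lift suggests the video improved recognition of the technique.
lift = recognition_rate(treatment) - recognition_rate(control)
```

A real evaluation would also test whether the lift is statistically significant across a large randomised sample, rather than reading it off point estimates.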
Findings

- 73% of individuals who watched a prebunking video were more likely to consistently spot misinformation online (source: Science Advances).
- Prebunking videos served as YouTube ads boosted recognition of manipulation techniques by 5% (source: Science Advances).
Approach and examples
Resources
Jigsaw is a unit within Google that explores threats to open societies and builds technology that inspires scalable solutions.