Dr Aviv Barnoy (assistant professor of Digitalisation and Business at Erasmus School of History, Culture and Communication) and his fellow researchers have found that clear norms about how people should share information online can significantly reduce the spread of misinformation.
Concerns about misinformation on social media have led platforms and policymakers to experiment with warnings, labels and other countermeasures. In a new open access article in New Media & Society, Dr Barnoy and his colleagues focus on the epistemic norms that guide how people think they ought to share information online – norms about truth, evidence and responsible sharing. They argue that these norms should be at the centre of efforts to understand and limit the spread of misinformation on social media.
Drawing on communication theory, the authors explain that sharing information online is a relatively new kind of communicative act (speech act) and that the norms surrounding it are far from settled or clear. To investigate how social media users think about false content, Barnoy first conducted reconstruction interviews with Facebook users who had previously shared misleading messages. The interviews showed that people often rely on recurring excuses and justifications for their posts – such as “I'm not a journalist, I can’t check everything” or claims that a post “can’t do any harm” or “represents a greater truth” – and that they also describe how quickly and automatically they sometimes share content.
These insights formed the basis for the second part of the study, which consisted of two survey experiments. Participants were randomly shown short messages that either contradicted or supported these excuses and justifications, or a simple label reminding them that the post contained scientific or medical information. They then had to decide whether to share a message that was either true or false. This mixed approach allowed the researchers to see not only whether these interventions reduced sharing, but also whether they helped people distinguish between true and false content – something that had been overlooked in many previous studies.
The results show that norms about responsible sharing are indeed important, but that different interventions work in different ways. Some of the more elaborate reminders made people less willing to share both true and false content. By contrast, a simple label shown directly on the post at the moment of decision selectively reduced the sharing of misleading information, while having little effect on the sharing of reliable messages. The researchers therefore emphasise the importance of timing: reminders are most effective when they are displayed at the moment someone decides to post something.
Dr Barnoy calls for further research into how best to design and manage these kinds of labels or prompts, and into who decides which norms should take precedence. The research shows that combating misinformation is not only a technical issue, but also a question of the social norms we as a society apply to responsible online communication.