Search for solutions to stop the spread of fake news

What if the people who have been sharing fake news could help solve the problem of its continuing spread? That is the proposition that underpins new research led by anthropologist Dr Fabio Mattioli at the University of Melbourne.

It builds on his previous research, which demonstrates that many of the people spreading fake news do not actually have any vested interests in the political outcomes of misinformation.

Dr Mattioli says questions about the spread of propaganda, misinformation and fake news are typically approached from a national security and “malicious agent” perspective. But there are many people who share such information through social media networks without any specific intent to cause harm; they are not “perpetrators”.

He has been researching the sharing of fake news since the 2016 US Presidential election campaign. At the time he was a postdoctoral researcher at New York University, where he had moved after completing a PhD at the City University of New York and teaching at the John Jay College of Criminal Justice.


The trigger for his interest was a proliferation of pro-Donald Trump posts on Facebook by Macedonian teenagers, which made global headlines after an article on BuzzFeed went viral and was featured on the late night US talk show Last Week Tonight with John Oliver. Dr Mattioli was already researching Macedonian culture and economic recovery when he saw the Facebook posts.

“Macedonia is not known for a hacker culture, and the country has little interest in the outcome of the US election. So I wondered: how do some people become so involved with fake news, and what is fake news for them?” he says.

Income, he discovered, was the answer – though not income earned as paid agents of a disinformation campaign. One way to earn money online is to share sensationalised or polarising content – clickbait – that attracts clicks and, with them, advertising revenue.

Without necessarily knowing anything about programming code, they are gaming the algorithm. “Macedonians who shared fake news started to figure out the rules of the social media algorithms, trying to work out how Facebook ‘thinks’,” Dr Mattioli says.

“They are trying to make a living out of social media in a highly impoverished and unequal society with few economic opportunities. But in a way, they expose and embody the vulnerabilities and political issues that exist around social media – such as the fact that platforms tend to polarise information, or that polarised information is shared more.”

“The Macedonian interlocutors I interviewed were not malicious actors, or perpetrators; they’ve been caught up in a misinformation campaign for completely different reasons,” he says. “And understanding the specificity of who shares or produces misinformation is key to building an effective and proportionate response.”

“Somebody who wants to empower Trump and affect US policy, compared to somebody who shares posts because they’re bored, or for economic reasons – these are quite different scenarios in terms of the challenge to national sovereignty. And they require different responses.”

His new project is investigating whether the same people who are sharing fake or polarised news could potentially become part of the solution.

“At the moment, they are caught up in a system that makes them affect our democratic life in a negative way. But perhaps we can build a different kind of social media architecture that enables them to make money by creating different sorts of sharing tools, so that instead of spreading fake news or exaggerated claims, they are doing the opposite.”

He is also interested in the ethical aspects of this kind of research, including government responses to the issues.

Censorship or regulation that affects what can be posted potentially eliminates what are currently legitimate income streams. And the research itself often involves scraping published content from social media channels without the explicit consent of individual account holders.

These are the kind of issues explored by the University of Melbourne’s Centre for Artificial Intelligence and Digital Ethics (CAIDE), of which Dr Mattioli is a member.

He says there are also questions around the consequences of taking no action. “Maybe you share fake news as a joke or without thinking seriously about it. But maybe, as a result, another person who sees it picks up a weapon and hurts somebody.”

“This is such an ethically complicated space. By humanising the people who are involved in the process, though, we can get a better, more nuanced understanding of the implications of both the disinformation activities and our responses to them,” Dr Mattioli says.


Dr Fabio Mattioli is a lecturer in Social Anthropology at the University of Melbourne’s School of Social and Political Sciences. He is an Associated Researcher at the Centre for Artificial Intelligence and Digital Ethics and part of the University’s multidisciplinary team researching ‘Information and Influence’.