Disinformation and the challenge to democracy

With the spread of disinformation and misinformation accelerating through social media channels, sociologist of science Dr Darrin Durant explains why disinformation threatens democracy, while misinformation is unavoidable.

Democracy is a two-sided beast. It wants to make the right decisions, but it also wants to have the right process in getting to that decision.

The processes for deliberation are often about opening up issues, whereas decisions by necessity are about closing down issues.

Disinformation sits right at the nexus of deliberation and decision-making. It does something to the deliberative process, and that affects the end result.

But you simply cannot isolate decision-making from disinformation or misinformation practices; that’s not how democracy works.

Disinformation is a very old story. At the same time, it has become something new.

We have seen its practices before. We’ve seen lies, deception and nonsense. We’ve seen systems of discussion living in their own filter bubbles. That part is old.

What is new is the intensification and expansion of disinformation. Its tentacles reach into so many different domains and contemporary practices: how the media reports, how courts function, how governments make decisions and communicate them to us, and our private deliberations.

My research is about expertise as part of the knowledge base used in the democratic deliberative process, helping to reach the ‘right’ decisions. Expertise is part of the checks and balances of democracy, and it helps to legitimise decisions.

What is crucial to the democratic process is a clear distinction between disinformation and misinformation. Without it, you’re going to undermine the very democracy you think you’re defending.

Disinformation is strategic. It is tactical and intentional. You have a good idea of what you’re doing. You may not be able to predict all the outcomes, but you operate with an idea about what your audiences think and how they might respond.

It includes deliberately unclear messages, ‘merchant of doubt’ strategies, omission, strategic manipulation and gaslighting: intentionally throwing so much mud against the wall that any reasonable person is confused about what to think, or about whether they understand anything at all.


Misinformation is when information is sent out with no deliberate intention of getting it wrong. You cannot necessarily trace it back to intentional deception or wrongdoing. You may be able to track it to particular ways of selecting information, and you may or may not agree with those.

It is analytically challenging to distinguish between information that is recognised in retrospect as being wrong, and what is a legitimate difference of opinion.

However, disagreement is an integral part of the politicised process in the first place. It doesn’t mean it’s not resolvable, but it does mean we should allow wiggle room for contention and contestation about misinformation.

If we’re too stringent about what we consider to be misinformation, what we’ll actually do is cut off legitimate dissent.

Ultimately, we want to distinguish between two kinds of discussion and democratic policy: one that allows for contestation and aspires to work out a reasonable and just solution among contending interests and values, and another that is about power politics and calculated manipulation.

We know that disinformation practices are occurring online, within the technologies that operate social media, and we shouldn’t think of these technologies as neutral mediums.

The algorithms behind social media embody all the values and prior decisions that go into creating those platforms and algorithms. Those decisions might be commercial, political or based on some other value, or all of those things mixed up.

These algorithms enable certain things and make them easy, while disabling other things and making them hard, and this shapes how the associated social practices unfold.

For instance, anonymity makes it easy to express extreme opinions, to say something insulting and mean. If it’s online and you’ve hurt someone, maybe you don’t even realise it, or don’t care. Anonymity contributes to the polarisation of views and disconnection from impacts.

Live interactions are fundamentally different. Derisive or joyful expressions provoke visible reactions, allowing speaker and listener to moderate themselves. Seeking commonality across differences is more likely.

In the same way, political extremism can be much more confronting when you meet it in person than it is online, where it can be packaged in a way that is more appealing.

For modern democracy, the reality is that a lot of discussion now happens online, and opinions are formed online. There are questions we need to answer about how the different social media platforms are shaping these discussions and how best to respond to the threat of disinformation on the internet, while still providing space for debate and disagreement.

Dr Darrin Durant is a lecturer in Science and Technology Studies, in the School of Historical and Philosophical Studies at the University of Melbourne. He is part of the University’s multi-disciplinary team researching ‘Information and Influence’.