The global citizens movement Avaaz installs life-sized Zuckerberg cutout figures wearing ‘Fix FaKebook’ t-shirts, in a protest action in front of the European Commission in Brussels, Belgium, 22 May 2018. EPA-EFE/STEPHANIE LECOCQ

Fake news covers a multitude of sins. Despite the fact that it has become a popular phrase – not least with certain world leaders – there is little clarity over what it actually means. Without a sound definition, however, any solution will be misguided. I’d like to suggest there are in fact three main categories of ‘fake news’, and each demands a different response. Some of it can be tackled with technology – but mostly it requires a new approach to media literacy.

First, there is monetised fabrication. This is people knowingly publishing falsehoods, simply as a means to make money via online advertising revenue. This form of fake news has been largely incentivised by programmatic advertising – where digital adverts follow users around the web, appearing on whatever sites they visit, irrespective of how truthful the content there is. And because we like clicking on outrageous headlines, false stories make money. Here, tech can play a role. For example, Facebook and Google can de-prioritise or demonetise sites or content designed on this model. (And there are encouraging signs that the big technology firms are doing exactly this.)

This is the easiest form of fake news to deal with. The second form is trickier: knowingly false or misleading propaganda, pushed for a specific political purpose. We’ve always had this. The great Benjamin Franklin stirred it up against the Brits during the American Revolution, sharing made-up stories about George III to better rile supporters. This is harder because it plays on human bias and gullibility. It’s also about to get a turbo-charge with the advent of powerful ‘deep fake’ technology, which will make it very easy to produce believable videos or audio of anyone saying anything.

True, technology can play a small role here: investment in systems that automatically spot and identify deep fake files will be critical, and will probably require some form of partnership between governments and private companies, in a similar way to existing technology designed to automatically spot illegal images of children. It’s this type of fake news where I see fact checkers playing a role. Hardliners won’t change their minds when presented with ‘facts’ – they rarely do – but the fence-sitters might.

But this category of falsehoods blurs with the final and most intransigent form of fake news: highly partisan, one-sided, biased news. Sometimes this is unintentional, sometimes it’s editorial. Sometimes it’s political pundits accusing opponents of peddling lies. In reality, this is where most of our ‘fake news’ sits. It’s the murky, complicated grey area of contested facts, assertions, rhetoric, and exaggeration. This is nothing new either, but digital technology has made it easier than ever for citizens to curate their own information worlds, through conscious or accidental online choices. What’s especially difficult about this is that it doesn’t even require the straight falsehoods that populate my first two categories. It is now easier than ever to surround yourself with carefully chosen, cherry-picked truths which, added together, are one-sided and highly misleading. If you string together enough of them, you end up with a coherent but very distorted personal reality. There is no technological or platform or policy solution to this problem – and we certainly don’t want governments stepping in to stipulate what’s true and what’s not.

It is this final category where expanded media literacy becomes so vital. Picking through the grey area of subtle bias, misleading headlines and soft propaganda is perhaps the single most important thing kids must learn at school nowadays. But it is not simply a technology issue. Nor is the answer simply to ‘question everything’ – that’s the world of conspiracy thinking. Instead it’s to develop a theory of epistemology: why should I trust one thing over another?

Answering that question today – and this is the basis of what I call expanded media literacy – requires both technology and philosophy. Yes, it should of course include tech: everyone needs to know roughly how algorithms, video splicing, IP spoofing and deep fakes work. Everyone needs to know what happens to their data, the business model of social media platforms, and the responsibilities of publishing. But to that must be added non-technical considerations. The modern citizen is expected to sift through an insane torrent of competing facts, networks, friend requests, claims, blogs, data, propaganda, misinformation, investigative journalism, charts, different charts, commentary and reportage. This is confusing and stressful, and so we lean on easy and simple emotional heuristics to make sense of the noise. We rely on ‘confirmation bias’ – reading things we already agree with, surrounding ourselves with similar people and avoiding information that does not conform to our pre-existing view of the world. Similarly, because there is so much noise out there, studies repeatedly find that emotional content is more likely to get traction online – shares, retweets, etc – than serious and thoughtful commentary and stories. Media literacy that doesn’t cover our own biases, our own irrationalities, our own psychological weaknesses will fail, since these are more powerful than any algorithm or newsfeed.

Even then, we’ll probably never ‘solve’ this problem. In fact, solving fake news would be very dangerous, since it would eliminate too much of the grey area that makes up politics. As long as humans are humans, and as long as we live in a free society, there will be fake news – and probably all three shades I’ve described. But with a new approach to media literacy we might mitigate its worst effects and learn to live with it.

Jamie Bartlett is a British author and journalist, writing primarily for The Spectator and The Telegraph. He is also the Director of the Centre for the Analysis of Social Media at Demos, run in conjunction with the University of Sussex. His latest book is “The People Vs Tech: How the Internet Is Killing Democracy” (Ebury, 2018).