So what’s being done about fake news?

Dr Paul Bernal (University of East Anglia) evaluates some of the features Google, Facebook and Wikipedia have added to tackle the issue of fake news.


Fake news has become big news over the past few months. And that’s mainly due to the election of Donald Trump, who is famously given to accusing his critics of making up stories about him, but who is himself arguably the biggest purveyor of misinformation ever to occupy the White House.

The fake news juggernaut rolls on, mainly on social media, polarising opinion and sowing division in its wake. Facing a barrage of criticism from both politicians and the “traditional” media, some of the biggest internet companies have come up with their own plans to address the problem.

Google has expanded its “fact check” system, which “identifies articles that include information fact checked by news publishers and fact-checking organisations”, putting a tag on them to confirm this. Facebook has placed ads in major newspapers, purged “tens of thousands of fake accounts” and built an algorithm to “automatically spot fake news”.


Meanwhile, Wikipedia founder Jimmy Wales has launched his own counter to fake news, WikiTribune, with bold claims: “The news is broken and we can fix it.” Like Wikipedia, WikiTribune relies on crowdsourcing, building a community of journalists and volunteers modelled on Wikipedia’s volunteer editors.

The two different approaches exemplify the ways that Google and Facebook differ from Wikipedia – and reflect two very different philosophies about the internet. Google and Facebook rely on the idea that computers (and the algorithms through which they function) are neutral and unbiased, an idea that allows them, to a certain extent, to avoid taking much responsibility for the content they link to or host. Wikipedia enlists the support of “good” people, who it believes can be unbiased and “neutral” (the “neutral point of view” is one of the “Five Pillars” that underpin Wikipedia).

But neither of these is really true. There is increasing evidence that, rather than eliminating or counterbalancing prejudices, algorithms can embed, exacerbate and exaggerate them. Wikipedia’s “neutral point of view” is not only theoretically questionable but has been shown empirically not to hold – in relation to sexism, for example: the vast majority of Wikipedia editors are men, and this is reflected in the way gender is often portrayed on Wikipedia.

Facebook experienced both problems in one episode in August 2016. When it was revealed that Facebook’s trending news module, rather than being purely algorithmic as many had believed, was “curated and tweaked” by humans, there was outrage, particularly from US conservatives, who accused Facebook of applying a liberal bias.

Facebook reacted by firing its editorial team with the aim of producing a more neutral, algorithmic system. The result was close to farcical as false and sometimes ridiculous stories were promoted by the algorithm. Which was worse? Biased human involvement or the algorithms? Both have significant problems – and these problems are both theoretical and practical.

Adding to this mess is another dimension. Obviously fake news – false stories deliberately created to misinform – is only part of the problem. Another is a tactic more familiar in the “traditional” media: using verifiable facts to create a fanciful narrative. Many anti-immigration narratives – that “health tourism” and “benefit tourism” are significant problems, for example – use real information taken out of context, with statistics manipulated to produce a story that is essentially false.

In the US, the negative reports about Hillary Clinton’s emails were based on a real investigation, but the stories that developed around them, and the narrative into which they were woven, were something quite different.

No easy fix

The first thing we need to be clear about is that there is no easy solution to this. Claims like WikiTribune’s that it will “fix” the problem are far from the truth. To find a way forward we need to dig a little deeper. Part of the problem is the growing public dissatisfaction with the mainstream media – and the media itself has to take some of the blame for that. If these outlets are not seen to hold politicians and others properly to account, but instead appear to help them weave false narratives and get away with lies and manipulation, it is hard for people to trust them. That in turn leaves space for fake news to fill, even if that then perpetuates the problem.


The fake narratives are also, however, a clue to the first part of the way forward: better “conventional” journalism. Around the Brexit referendum and the US presidential election, significant parts of the media failed to hold politicians properly to account or to expose their lies and manipulations. This needs to change – and the general election campaign currently underway in the UK is a big test of whether it can.

The second, even more important – and harder – part of the solution is to reduce the dependence on Facebook in particular for news. This works both ways: journalists and newspapers should resist using Facebook as a platform, and people need to be weaned off Facebook as a way of getting their news. No amount of tweaking algorithms or purging fake accounts can counter the fact that platforms such as Facebook are tailor-made for the sharing of information – and misinformation. And the purveyors of fake news are experts at gaming algorithms, so whatever Facebook does, they will find a way around it. The only real solution is to stop playing the game.

That’s easier said than done – these social media platforms have become so much a part of so many people’s lives. The fact that we are becoming more aware of the problem with fake news is at least a start – but little more than that. It is not a problem that is going away anytime soon.

This article was originally published on The Conversation. Read the original article.
