New Anti-Fake News Strategy Is Not Going To Work

Are you familiar with the new tips for spotting fake news on Facebook? The social media company was investigated for spreading fake news (propaganda) during the US presidential election. It is clear that letting fake stories about politicians trafficking child slaves or launching terrorist attacks spread without consequence is bad for democracy and society.

Something had to be done. Facebook's depressingly incompetent strategy to combat fake news is now available. It has three parts, each frustratingly ineptly thought out.

New products

The plan's first component is to create new products that will stop fake news stories from spreading. Facebook claims it is trying to make it easier for people to report fake news stories, and to identify warning signs, such as when reading an item makes people significantly less likely to share it.

Suspect stories will be sent to independent fact-checkers. Fake stories will be flagged and linked to a corrective article. It sounds great, but it will not work.

If non-experts could reliably tell the difference between fake news and real news, there would not be a fake news problem in the first place.

Facebook has also stated that, given its scale and role, it cannot be the arbiter of truth itself.

Facebook functions as a megaphone. Normally, the megaphone company is not responsible when someone says something terrible into it. Facebook, however, is a special megaphone that listens and adjusts the volume.

Your newsfeed's content and ordering are largely determined by the company's algorithms. If Facebook's algorithms spread hate speech from neo-Nazis across the internet, then that is Facebook's fault.

Worse, even if Facebook correctly labels fake news as contested, it will still shape public discourse through availability cascades.

A message becomes more plausible and reasonable each time it is repeated, whatever the source. Bold lies can be very powerful precisely because repeated fact-checking makes people remember the claim, and later misremember it as true.

These effects are extremely strong and cannot be undone by weak interventions like public service announcements. Which brings us to the second part of Facebook's strategy: helping people make better informed decisions when they come across fake news.

We Can Help You To Help Yourself

Facebook will release public service announcements and fund the news integrity initiative in order to help people make informed judgments about what they read and share online. This is also not a good idea.

Cognitive psychology research has produced a lot of evidence about correcting systematic errors in reasoning, such as the failure to recognize propaganda or bias. We have known since the 1980s that simply warning people about their biases does not work.

Similarly, funding a news integrity project sounds great until you realize that what the company is actually talking about is teaching critical thinking skills.

Primary, secondary, and tertiary education all already aim to improve critical thinking skills. If four years of university struggle to instill them, what good will a few YouTube videos and a fake news FAQ do? It is unlikely that funding a few research projects or meetings of industry experts will make a difference.

Disrupting Economic Incentives

This non-strategy's third prong is to crack down on spammers and fake accounts, and to make it more difficult for them to purchase advertisements. This is a great idea, but it rests on the false premise that most fake news comes from con artists rather than major news outlets.

Fake news is Orwellian newspeak. It refers to a completely fabricated story, sourced from an unknown outlet and marketed as news for political or financial gain. These stories are the most suspect and thus the least concerning. More insidious are the lies and bias coming from public officials, official reports, and mainstream news.