According to a BBC report, social networking website Facebook recently ran a strange experiment to help combat fake news by promoting sceptical comments on posts.
Facebook is certainly one of the largest platforms on which fake news circulates every single day, and many claim that the misinformation spread on the site may even have had an effect on the 2016 US presidential election.
So it’s no wonder that Facebook has been attempting to implement a variety of tools to help combat the spread of such news, including partnering with a range of fact-checking organisations to help the site label news that has been disputed.
However, a recent test by Facebook may not have given the results it would have wanted. The test “promoted” comments containing the word ‘fake’, so they would appear above all other comments on a particular Facebook post.
Presumably the experiment – which was only rolled out to a certain number of Facebook users – was designed to see whether highlighting comments that expressed disbelief would help other users judge the legitimacy of the post on which they were published.
However, Facebook users who were selected for the experiment soon complained that comments with the word ‘fake’ would appear on most of the posts in their news feed, regardless of the posts’ legitimacy. These included posts from the BBC, The Economist and The New York Times.
Some users were clearly exasperated.
Can @facebook explain why the featured comment on every article contains “fake news”? 1000’s of comments. It has to be on purpose.
— Derek Spent (@derekspent) November 5, 2017
Facebook has since said it has ended the experiment, though more may very well be on the way.