Facebook launches UK fact-checking programme to fight fake news
Facebook today said it will roll out its fact-checking service in the UK to help deal with the fake news that has plagued its platform in recent years.
The social media giant will work with fact-checking charity Full Fact to review and rate the accuracy of posts on its site.
Facebook said the programme will focus on the most harmful fake and misleading content, such as bogus cancer cures and inaccurate information about terror attacks and elections.
Under the new scheme, users will be able to flag dubious content. A team of fact-checkers will then review the post and give it an accuracy rating.
The tech firm will not stop people from sharing inaccurate content, but it will push offending posts further down the news feed, meaning fewer people will see them.
Sarah Brown, training and news literacy manager at Facebook, said: “People don’t want to see false news on Facebook, and nor do we. We’re delighted to be working with an organisation as reputable and respected as Full Fact to tackle this issue.
“By combining technology with the expertise of our fact-checking partners, we’re working continuously to reduce the spread of misinformation on our platform.”
The move comes amid growing scrutiny of Facebook’s role in spreading misinformation.
The company has been criticised over political ads run on its platform during the Brexit referendum campaign and the 2016 US presidential election.
A study published yesterday by researchers at New York University and Princeton University found that older users are the most likely to share fake news on Facebook.
Full Fact director Will Moy said: “Fact-checking isn’t glamorous. It can take hours, days or weeks, so nobody has time to properly check everything they see online.
“But it’s important somebody’s doing it – because online misinformation, at its worst, can seriously damage people’s safety or health.”
Facebook said fact-checkers will only review content presented as factual reporting, with opinion and satire exempt from the programme.