Government set to hold social media firms responsible for harmful content
Bosses of social media giants such as Facebook could be held responsible for rampant harmful content on their platforms, the government is set to warn next week.
The Department for Digital, Culture, Media and Sport will legislate for a new statutory duty of care, to be funded through a levy on media companies, according to a leaked version of an upcoming white paper seen by the Guardian.
The rules would be policed by Ofcom until a new regulator can be established. That body would have the power to levy substantial fines against both firms and executives who breach the law.
The proposals follow the death of teenager Molly Russell, whose parents said she took her own life after viewing images of self-harm online.
Former digital secretary and current health secretary Matt Hancock said at the time of Russell's death that parliament could opt to ban companies that do not comply with the new laws, though he added: "It's not where I'd like to end up."
It also comes after last month's shootings at two mosques in Christchurch, which the attacker live-streamed on Facebook.
DCMS could not be reached for comment last night.
Facebook founder Mark Zuckerberg called on regulators and governments last week to take a “more active role” in controlling content on the internet.
Writing in the Washington Post, Zuckerberg said firms like his could not be expected to shoulder all the responsibility for moderating harmful content.
“Every day, we make decisions about what speech is harmful, what constitutes political advertising, and how to prevent sophisticated cyberattacks. These are important for keeping our community safe,” he wrote.
“But if we were starting from scratch, we wouldn’t ask companies to make these judgments alone.”