Fines, bans and directors liable: Government plans Big Tech crackdown
The government is set to announce sweeping new measures for social media companies, as part of a crackdown on big tech.
Digital Secretary Oliver Dowden and Home Secretary Priti Patel will today propose plans to hand Ofcom considerable powers over social media firms.
Ofcom handed new powers
The regulator will be able to fine companies failing in their duty of care up to £18m or 10 per cent of their annual global turnover.
It will also have the power to block non-compliant services from being accessed in the UK and the legislation includes provisions to impose criminal sanctions on senior managers.
The proposals are part of the government’s response to the Online Harms White Paper consultation, which Dowden has hailed as a “new age of accountability for tech to protect children and vulnerable users”.
The new regulations, which the government plans to bring forward in an Online Safety Bill next year, will apply to any company in the world hosting user-generated content online that is accessible to people in the UK.
“I’m unashamedly pro-tech but that can’t mean a tech free-for-all,” said Dowden. “Today Britain is setting the global standard for safety online with the most comprehensive approach yet to online regulation.”
Julian Knight MP, chair of the DCMS Committee, welcomed the proposals but said “even hefty fines can be small change to tech giants and it’s concerning that the prospect of criminal liability would be held as a last resort.”
This Thursday the sub-committee on Online Harms and Disinformation will hear evidence from Facebook, YouTube and TikTok on the steps they are taking to tackle harmful online content.
“The safety of our online communities – our users and our creators – is our top priority, and so we haven’t waited for legislation to act. We have worked with industry, community groups and the Government to tackle harmful content,” said Ben McOwen Wilson, managing director of YouTube UK.
Clampdown on illegal content
Social media sites and similar services will have a duty to limit the spread of illegal content, with the government calling on tech platforms to do more to protect children from being exposed to “harmful content”.
The government will adopt a categorised approach that will see companies have different responsibilities for different categories of content.
All companies “will need to take appropriate steps to address illegal content” such as child sexual abuse and terrorism. They will also be required to assess the likelihood of children accessing their services.
The role of social media has become particularly pertinent in recent weeks as the government attempts to battle coronavirus vaccine misinformation.
Under the new regulation, the most popular social media sites will be required to set out clear terms stating how they will handle content that is legal but could cause “significant physical or psychological harm”. This includes disinformation and misinformation, “such as misleading content about coronavirus vaccines”.