Data watchdog targets social media with new Children’s Code
The UK data watchdog is setting its sights on social media and gaming companies, with new regulations to protect children coming into force next week.
The Children’s Code introduces new rules on how online services likely to be accessed by children, such as apps, online games, and web and social media sites, handle their data.
These include age verification checks and limits on what data is collected, as well as bans on location tracking and nudge techniques designed to encourage children to hand over more data.
The new rules, known formally as the Age Appropriate Design Code, were introduced in September 2020 with a 12-month grace period for companies to adapt to the changes.
Speaking earlier this year, Information Commissioner Elizabeth Denham urged businesses to ensure they met the new standards before the deadline.
Breaches of the new code will carry the same punishments as breaches of the EU’s GDPR, including fines of up to four per cent of annual global turnover for non-compliance.
“In the coming decade, I believe children’s codes will be adopted by a great number of jurisdictions and we will look back and find it astonishing that there was ever a time that children did not have these mandated protections,” Denham said.
“There is a more fundamental point here too: if we have a generation who grow up seeing digital services misuse their personal data, what does that do to their trust in innovation in the future?”
The crackdown comes amid wider scrutiny over the way social media giants protect users and tackle harmful material on their platforms.
Popular video-sharing app TikTok is being sued for billions of pounds over allegations that it illegally harvested the data of millions of European children.
The Chinese-owned platform and other firms such as YouTube and Instagram have all introduced stricter rules around children’s privacy in recent months ahead of the code coming into force.