Explainer: Snapchat under the spotlight for high number of underage users
Parental concern about children’s online safety is an issue as old as the Internet, but one that’s been aggravated by the popularity of social media. Snapchat looks to be the latest target of scrutiny from regulators in the UK.
The Information Commissioner’s Office, the UK’s data regulator, is gathering information about the messaging app over concerns that thousands of children under 13 are using it.
Back in March, it was reported that Snapchat had removed only a small number of the underage accounts present in the UK. If the data regulator is not satisfied with the information it gathers, it will likely launch a formal investigation into the app; if that investigation finds evidence of wrongdoing, the company typically faces a fine.
According to Ofcom, Snapchat is the most popular app among underage users. While older demographics tend to gravitate towards Twitter and Facebook, children faking their age has been a persistent problem for other social media platforms such as Instagram and TikTok as well.
Snapchat does have some measures in place to protect users who are under 18, and a spokesperson for the tech firm insisted “we take our obligations to keep under 13s off of Snapchat seriously”. For example, adult users are not allowed to add individuals who are 17 and under unless they have a certain number of friends in common. Last summer, it also introduced parental controls, following other platforms like Instagram and TikTok.
But the social media company has been repeatedly accused of not doing enough to keep underage users away, compared with some of its competitors. Data from Ofcom showed Snapchat removed only 700 suspected underage accounts in the UK between April 2021 and April 2022. By contrast, TikTok removed 180,000 accounts during the same period.
Associations worldwide, including the American Psychological Association, have warned about the risk social media can pose to teens’ mental health, especially in cases of heavy exposure to violent or self-harm-related content. The risks are amplified when a ten-year-old is exposed to this type of content.
Social media is not inherently bad, despite what the prevailing narrative often suggests. But the burden of establishing safety boundaries does not rest only with parents, teachers and governments; it lies first and foremost with the apps themselves.
The data watchdog’s findings will show whether Snapchat was right in claiming it is doing its best, or whether, as seems more likely, there is a lot more that can be done to keep young kids away from content and apps that were never tailored for them.