Social media sites must take legal responsibility for child safety, states tech committee
Social media sites must take on legal responsibility for the health and wellbeing of children on their platforms, a parliamentary committee has found.
The Science and Technology Committee (STC) today said tech giants such as Facebook, Instagram and Snapchat must be subject to a formal legal duty of care for their users.
The report found that while social media can have a positive impact, it can also harm the health and emotional wellbeing of young people.
Pre-existing issues such as sleep patterns, body image and bullying have been exacerbated by the rise of social media, the report said.
Online platforms have also led to a rise in grooming and child abuse, with the National Crime Agency reporting a 700 per cent increase in referrals of missing and exploited children over the last four years.
The STC said social media firms must share data with researchers, and called on the government to consider bringing in new laws to enforce data sharing.
The report also found that current regulation and legislation is insufficient and leads to a so-called "standards lottery".
It said a comprehensive framework must be brought in to plug gaps that allow sites such as YouTube, Google, Facebook and Twitter to escape regulation.
Norman Lamb, chair of the STC, said: “We understand their eagerness to protect the privacy of users but sharing data with bona fide researchers is the only way society can truly start to understand the impact, both positive and negative, that social media is having on the modern world.
“During our inquiry, we heard that social media companies had openly refused to share data with researchers who are keen to examine patterns of use and their effects. This is not good enough.”
The report echoes calls from the children’s commissioner Anne Longfield, who yesterday wrote an open letter calling for a statutory duty of care for social media companies and the establishment of a digital ombudsman.
The firms have come under increased scrutiny over the way they police material posted on their platforms following the death of 14-year-old Molly Russell.
Russell’s father has blamed Instagram for his daughter’s death after it emerged she had viewed images relating to self-harm on the photo-sharing app before taking her own life.
A DCMS spokesperson said: “We have heard calls for an internet regulator and to place a statutory ‘duty of care’ on platforms, and are seriously considering all options.
“Social media companies clearly need to do more to ensure they are not promoting harmful content to vulnerable people.”
The government is due to publish a white paper on online harms in the coming weeks, which could set out plans for new regulation.