Kudos for OnlyFans: Watchdog says smaller video platforms need to do more to protect kids
New research from Ofcom has found that smaller adult video-sharing platforms need to do more to prevent children from accessing porn.
In a report published today, the media regulator said it was concerned that whilst smaller companies had some age verification measures in place, they did not go far enough.
One unnamed adult platform told Ofcom that it had decided against robust age verification because it would cut into profitability.
Some sites relied simply on self-declaration of age as the barrier to entry for youngsters, which the regulator found troubling given that a report last month found a third of under-18s lied about their age when using social media.
Following the report, the regulator said it has opened a formal investigation into one firm, Tapnet, which operates the small adult site RevealMe.
Bigger tech firms fared rather better, with Ofcom citing both Snap’s and TikTok’s parental controls and content oversight strategies as helpful for protecting children.
“The report from Ofcom addresses a number of important industry issues. In particular, our approach around moderation of public content, parental tools and proactive education campaigns are recognised positively in the report,” a Snap spokesperson told City A.M.
Meanwhile, OnlyFans, the biggest video-sharing platform, was praised for its use of third-party verification tools like Yoti, which carry out facial age estimation checks.
OnlyFans’ Chief Strategy & Operations Officer Keily Blair told City A.M. that the company “aims to be the safest digital media platform in the world”.
“We will continue to go above and beyond the legal requirements, and our peers, to provide a safe platform for Creators and Fans while maximizing their freedom to control and monetize the lawful content they produce and view on OnlyFans,” she said.
Ofcom chief Dame Melanie Dawes said today’s report exposes gaps in the sector and “puts UK adult sites on notice” to stop putting profits before child safety.
“We’ve used our powers to lift the lid on what UK video sites are doing to look after the people who use them. It shows that regulation can make a difference, as some companies have responded by introducing new safety measures, including age verification and parental controls,” Dawes said.
The report also comes as the online safety bill, which will give the watchdog more powers to hold social media firms to account, continues to make its way through parliament.
Ofcom already has some power to regulate platforms to ensure they protect people from harmful videos, but early drafts of the new online safety bill have suggested these powers could be extended even further.
However, it is unclear whether the new culture secretary Michelle Donelan is keen to push forward with this measure. She has already said that the rules requiring tech firms to tackle “legal but harmful” material would be altered.