TikTok tightens privacy features as online harms laws loom
TikTok has tightened its privacy features for young users as the government prepares to roll out tough new rules to improve safety on social media.
The hit video-sharing app today said that all users under the age of 16 will have their accounts set to private by default, meaning only approved followers can view their videos.
Comments on videos created by users under 16 will also be restricted so that younger users can only receive comments from friends.
Other changes announced today include a block on downloading videos made by children and reducing how often young users’ accounts are recommended to other people on the app.
TikTok said the package of measures was designed to drive up standards of user privacy and safety on the app.
“We want to encourage our younger users to actively engage in their online privacy journey, and by doing so early we hope to inspire them to take an active role and make informed decisions about their online privacy,” said TikTok’s head of privacy, Elaine Fox.
TikTok, which is wildly popular among teenagers and has seen its user numbers surge during the Covid-19 pandemic, has come under fire amid concerns about harmful material circulating on the app.
Previous efforts to improve safety include banning direct messaging for under-16s and allowing parents to control features on their child’s account remotely.
TikTok says the app is intended only for those aged 13 and over, but it has no way of verifying this or of preventing children from skirting the rules by entering a false date of birth.
The latest measures come as the government prepares to introduce new online harms legislation aimed at holding tech firms such as TikTok responsible for the material posted to their platforms.
The new laws, which will be enforced by Ofcom, are set to place a legal duty of care on social media companies towards their users, with significant fines for non-compliance.