Child safety campaigners criticise Meta’s end-to-end encryption plans
Child safety campaigners have criticised recently published research, commissioned by Meta, that makes the case for end-to-end encryption as a positive driver for human rights.
The independent report, commissioned by Meta and produced by Business for Social Responsibility (BSR), a nonprofit focused on corporate impacts, concluded that end-to-end encryption is crucial for protecting human rights.
End-to-end encryption stops platforms from accessing users’ communications, and Meta plans to roll it out across all of its messaging platforms in the coming years.
As it stands, end-to-end encryption is widely employed on Meta-owned WhatsApp, but has not yet been implemented on Facebook Messenger or Instagram direct messages.
The main argument against end-to-end encryption is that it limits law enforcement’s ability to access communications, leaving people vulnerable to child abuse and extremism.
Andy Burrows, Head of Child Safety Online Policy at the NSPCC, said: “Three years after Meta announced its intention to roll out end-to-end encryption, the company has published a wholly insufficient plan to tackle the child abuse risks that will result if it is fully implemented across their platforms. This limited approach will be likely to lead to more child abuse taking place and less grooming being detected.”
“Meta will put the onus on children to report their own abuse which either shows a complete lack of understanding of safeguarding or a lack of compassion for children being sexually abused on its products,” he added.
He said that Meta has “no coherent solutions” and called on the government to fast-track powers for Ofcom under the Online Safety Bill so that end-to-end encryption on Meta’s platforms works for the safety and privacy of all users.