Apple pushes back controversial child protection features after privacy objections
Apple will delay the release of its controversial new child protection features in order to collect more feedback and make improvements, after a feature that would scan customers’ phones and computers for child sex abuse images sparked widespread criticism for breaching user privacy.
More than 90 policy and rights groups worldwide, as well as Apple’s own employees, expressed concerns or called for the tech giant to abandon the photo scanning feature last month.
Before today’s changes, the features had been scheduled to roll out later this year.
In a statement released on Friday, Apple said: “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
But the postponement of the features also drew criticism from child protection campaigners.
Andy Burrows, head of child safety online policy at the NSPCC, said in a tweet: “This is incredibly disappointing. Apple had adopted a proportionate approach that sought to balance user safety and privacy, and should have stood their ground.”
Critics of the firm’s plan have warned that repressive governments could use the technology to monitor the public, pointing out that whoever controls the database can search for whatever content they want.
Matthew Green, a researcher at Johns Hopkins University, said: “Whether they turn out to be right or wrong on that point hardly matters. This will break the dam – governments will demand it from everyone.”
Last month, Apple announced that its “NeuralHash” technology would allow it to detect known child sex abuse material (CSAM) images stored in iCloud Photos. The automated system would perform on-device checks of photos before they are uploaded to iCloud.
The system checks for matches against a database of known CSAM compiled by the National Center for Missing and Exploited Children (NCMEC) and alerts human reviewers if illegal content is found. If the image is verified, the reviewer contacts law enforcement.
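Apple has not published its algorithm, but the general pattern it describes is hash matching against a known database with a review threshold. The Swift sketch below is an illustrative assumption only: names such as OnDeviceMatcher, perceptualHash and reviewThreshold are hypothetical, and the stand-in hash function is not a perceptual hash, let alone Apple’s NeuralHash.

```swift
import Foundation

// Minimal sketch of on-device matching against a database of known hashes.
// All names and the threshold value are hypothetical, not Apple's implementation.
struct OnDeviceMatcher {
    let knownHashes: Set<String>   // e.g. hashes derived from the NCMEC database
    let reviewThreshold: Int       // matches required before human review is triggered

    // Placeholder: a real system would derive a perceptual hash from image
    // contents so that visually similar images map to the same value.
    func perceptualHash(of imageData: Data) -> String {
        return String(imageData.hashValue)   // stand-in only, not a perceptual hash
    }

    // Count photos whose hash appears in the known database; flag for human
    // review only once the match count reaches the threshold.
    func shouldFlagForReview(photos: [Data]) -> Bool {
        let matches = photos.filter { knownHashes.contains(perceptualHash(of: $0)) }.count
        return matches >= reviewThreshold
    }
}

// Toy usage: no photos means no matches, so nothing is flagged.
let matcher = OnDeviceMatcher(knownHashes: ["12345"], reviewThreshold: 30)
print(matcher.shouldFlagForReview(photos: []))   // false
```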
When it announced the new features, Apple said the method was “designed with user privacy in mind” and claimed its technology provides “an extremely high level of accuracy”, with less than a one-in-a-trillion chance of an incorrect flag.
“This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM,” Apple said in a blog post.