Campaigners call on Apple to abandon surveillance plans
An international coalition of more than 90 civil rights and policy groups is calling on Apple to scrap plans to surveil its US customers’ photos.
Earlier this month, Apple announced plans to search people’s devices for child sexual abuse material (CSAM) using new “NeuralHash” technology to perform on-device checks. The automated system will alert human reviewers if suspected illegal content is found, and they will contact law enforcement if the image is verified.
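Apple has not published full implementation details, but systems of this kind generally work by comparing a perceptual hash of each photo against a database of hashes of known illegal images. The Swift sketch below is purely illustrative: the 64-bit hash type, the Hamming-distance threshold and all function names are assumptions for demonstration, and Apple’s actual NeuralHash is a proprietary, neural-network-based system that differs in detail.

```swift
// Toy illustration of the matching step in a perceptual-hash scanning
// system. NOT Apple's NeuralHash — hash size, threshold and names are
// hypothetical.

// A perceptual hash reduced to a fixed-length bit string (here, 64 bits).
// Unlike cryptographic hashes, near-identical images produce hashes that
// differ in only a few bits.
typealias PerceptualHash = UInt64

// Hamming distance: the number of bits that differ between two hashes.
func hammingDistance(_ a: PerceptualHash, _ b: PerceptualHash) -> Int {
    (a ^ b).nonzeroBitCount
}

// Hypothetical on-device check: flag a photo for human review if its hash
// is within the threshold distance of any entry in the known-image database.
func shouldFlagForReview(photoHash: PerceptualHash,
                         knownHashes: [PerceptualHash],
                         maxDistance: Int = 4) -> Bool {
    knownHashes.contains { hammingDistance(photoHash, $0) <= maxDistance }
}

// Example: a hash one bit away from a database entry is flagged.
let database: [PerceptualHash] = [0xDEADBEEF12345678]
let candidate: PerceptualHash = 0xDEADBEEF12345679
print(shouldFlagForReview(photoHash: candidate, knownHashes: database)) // true
```

The threshold-based matching shown here is also what underlies the campaigners’ concern: the same mechanism works identically whatever hashes a government requires the database to contain.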
Today civil society groups, organised by the Center for Democracy and Technology, published an open letter warning Apple that its new algorithm lays the foundation for “censorship” and “persecution.”
The groups wrote: “Once this capability is built into Apple products, the company and its competitors will face enormous pressure — and potentially legal requirements — from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable.”
In documentation released last week, Apple revealed that new security features can also alert parents if children under the age of 13 choose to view or send sexually explicit content.
The letter raised concerns that technology for explicit image detection is “notoriously unreliable” and could wrongly flag sexually explicit images used in art, health information and educational material, which children have a right to view.
The groups also warned the feature could be detrimental to child welfare, putting LGBTQ+ youths with “unsympathetic parents” at particular risk.
While Apple is yet to respond to the letter, it has previously stated that the new technology was “designed with user privacy in mind” and insisted it has “an extremely high level of accuracy.”
The company has said that the new features are due to be rolled out to US iPhones later this year in updates to iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.
Read more: Apple child protection features spark concern among its own staff