A ‘spy clause’ to break encrypted texts while preserving privacy is still a dream
After years of wrangling, the Online Safety Bill is still caught on impractical plans to break into end-to-end encrypted messages on apps like WhatsApp, writes Eliot Wilson
Tomorrow the government’s Online Safety Bill comes back to the House of Commons. This measure is now a veteran of Westminster: it was introduced in March 2022 and has had a turbulent passage through Parliament. It fell victim to brutal surgery during the committee stage, and to another overhaul last week in the House of Lords.
One of the most contentious parts of the bill is a clause dubbed by some campaigners the “spy clause”. This would allow Ofcom to identify and take down content sent under end-to-end encryption, on platforms like WhatsApp and Signal, if it related to terrorism or child sexual exploitation and abuse (CSEA). The provision has been stiffly resisted not only by the tech industry but by civil rights campaign groups like Amnesty International. Both Meta, which owns WhatsApp, and Signal have said that they would rather have their services blocked in the UK than submit to the intrusion on encrypted content.
Governments are bad at regulating technology. Legislation is necessarily a blunt instrument—everything must be set down explicitly and precisely, and there is no room for nuance—and it is often the case that the drafters of that legislation lack the depth of expertise of the industries which it is supposed to regulate. But a statement by a minister in the House of Lords last Wednesday demonstrated the fundamental weakness in the provisions of the bill.
Lord Parkinson of Whitley Bay tacitly admitted what campaigners had been arguing for months. “A notice [to scan encrypted data] can only be issued where technically feasible and where technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content.” What this alludes to but does not say explicitly is that such technology does not currently exist, and is unlikely to be developed for many years, if at all.
The result is what Boris Johnson might have called a bugger’s muddle. The government insists its position “has not changed”, because to change one’s mind is now one of the highest crimes in politics, and the text of the clause remains in the bill. It will, however, be entirely declaratory, granting Ofcom the power to do something which is technologically not possible (but could, hypothetically, be possible in the future). One could see this as a marvellously British compromise: the government can say that it has introduced tough measures to deal with terrorist and CSEA content online, while the tech companies do not need to change their practices or their relationships with their customers.
I am not blind to the value of compromise. Much of our constitutional settlement depends on an unspoken gulf between theory and practice. But on this occasion, it seems to me that the government has dodged an argument which will have to be faced sooner or later: where is the border between “safety” (whether for children or potential victims of terrorism) and privacy? Let us suppose, for a moment, that the technology Ofcom would require (“magical thinking”, as tech expert Heather Burns sharply described it) did indeed exist. Would it be right for Ofcom to have the power to monitor end-to-end encrypted content?
No one wants child abuse or terrorism to flourish. The government knows that, and has used that emotional leverage to boost support for this ill-fated clause. But that cannot be an unanswerable argument. At some point, if we do not want to live in a society of perpetual and pervasive surveillance, we have to draw lines, and I think we are entitled, as citizens, to have means of communication which the government cannot intercept or monitor. Even if there were clear, unambiguous evidence—and there is not—that giving Ofcom these powers would make a real difference to these dreadful offences, it would not justify the violation of our fundamental rights.
Section 122 of the Online Safety Act 2023 (as the clause will become) may become a strange, ghostly dead letter, an instrument of a policy which was dreamed but never realised. That is the right result. How much it matters that it has been reached in the wrong way is a matter of personal taste, but one hopes that the government has noted the fierce reaction to its legislative overreach, and lodges that fact in Whitehall’s collective memory.