Security researchers latest to blast UK’s Online Safety Bill as encryption risk
Nearly 70 IT security and privacy academics have added to the clamour of alarm over the damage the UK’s Online Safety Bill could wreak on, er, online safety unless it’s amended to ensure it does not undermine strong encryption.
Writing in an open letter, 68 UK-affiliated security and privacy researchers have warned the draft legislation poses a stark risk to essential security technologies that are routinely used to keep digital communications safe.
“As independent information security and cryptography researchers, we build technologies that keep people safe online. It is in this capacity that we see the need to stress that the safety provided by these essential technologies is now under threat in the Online Safety Bill,” the academics warn, echoing concerns already expressed by end-to-end encrypted comms services such as WhatsApp, Signal and Element — which have said they would opt to withdraw services from the market or be blocked by UK authorities rather than compromise the level of security provided to their users.
Last week Apple also made a public intervention, warning the Bill poses “a serious threat” to end-to-end encryption, which it described as “a critical capability”. Without amendments to protect strong E2EE, Apple suggested, the bill would leave UK citizens at greater risk — counter to the “safety” claim in the legislation’s title.
An independent legal analysis of the draft legislation also warned last year that the surveillance powers contained in the bill risk the integrity of E2EE.
The proposed legislation has already passed through scrutiny in the House of Commons and is currently at the report stage in the House of Lords — where peers have the chance to suggest amendments. So the security academics are hoping their expertise will mobilize lawmakers in the second chamber to step in and defend encryption where MPs have failed.
“We understand that this is a critical time for the Online Safety Bill, as it is being discussed in the House of Lords before being returned to the Commons this summer,” they write. “In brief, our concern is that surveillance technologies are deployed in the spirit of providing online safety. This act undermines privacy guarantees and, indeed, safety online.”
The academics, who hold professorships and other positions at universities around the country — including a number of Russell Group research-intensive institutions such as King’s College and Imperial College in London, Oxford and Cambridge, Edinburgh, Sheffield and Manchester to name a few — say their aim with the letter is to highlight “alarming misunderstandings and misconceptions around the Online Safety Bill and its interaction with the privacy and security technologies that our daily online interactions and communication rely on”.
Their core concern is the bill’s push for “routine monitoring” of people’s comms, purportedly with the goal of combating the spread of child sexual abuse and exploitation (CSEA) content — but which the academics argue is a sledgehammer-to-crack-a-nut approach that will cause massive harm to the public and society in general by undermining critical security protocols we all rely on.
Routine monitoring of private comms is “categorically incompatible with maintaining today’s (and internationally adopted) online communication protocols that offer privacy guarantees similar to face-to-face conversations”, they assert, warning against “attempts to sidestep this contradiction” by applying additional tech — either client-side scanning or so-called “no one but us” crypto backdoors — as “doomed to fail on the technological and likely societal level”.
“Technology is not a magic wand,” they emphasize, before offering succinct summaries of why the two possible routes to accessing protected private messages can’t be compatible with maintaining people’s right to privacy and security of their information.
“There is no technological solution to the contradiction inherent in both keeping information confidential from third parties and sharing that same information with third parties,” the experts warn, adding: “The history of ‘no one but us’ cryptographic backdoors is a history of failures, from the Clipper chip to DualEC. All technological solutions being put forward share that they give a third party access to private speech, messages and images under some criteria defined by that third party.”
On client-side scanning, they point out that routinely applying such technology to mobile users’ messages is disproportionate in a democratic society — amounting to surveillance by default — aka “placing a mandatory, always-on automatic wiretap in every device to scan for prohibited content”, as the letter puts it.
Nor, in their expert analysis, is client-side scanning technology robust enough for what the bill demands.
“This idea of a ‘police officer in your pocket’ has the immediate technological problem that it must both be able to accurately detect and reveal the targeted content and not detect and reveal content that is not targeted, even assuming a precise agreement on what ought to be targeted,” they write, warning that even client-side scanning tech that’s been designed to detect known CSEA has accuracy issues.
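To make the mechanics concrete: known-content detection generally works by computing a perceptual hash of each image and comparing it against a list of hashes of known prohibited material. The Python sketch below is purely illustrative (a hand-rolled 16-bit “average hash” of our own devising, not any deployed system such as PhotoDNA or Apple’s NeuralHash), but it shows the match-against-a-list structure and why the matching threshold trades false negatives against false positives:

```python
# Toy sketch of how "known content" client-side scanning works: compute
# a perceptual hash of each image, then compare it against a blocklist
# of hashes of known prohibited images. Everything here is illustrative;
# real systems are far more elaborate but share this structure.

def average_hash(pixels: list[list[int]]) -> int:
    """One bit per pixel: is the pixel brighter than the image mean?"""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of bits on which two hashes differ."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist entry: the hash of one known prohibited image.
BLOCKLIST = {0b1100_1100_0011_0011}

def scan(pixels: list[list[int]], threshold: int = 2) -> bool:
    """Flag the image if its hash is within `threshold` bits of the list.

    The threshold is the accuracy dial the researchers worry about:
    too tight and re-encoded copies slip through (false negatives);
    too loose and unrelated images get flagged (false positives).
    """
    h = average_hash(pixels)
    return any(hamming(h, bad) <= threshold for bad in BLOCKLIST)

# A 4x4 greyscale "image" whose hash matches the blocklist entry,
# then lightly perturbed (think recompression noise); it still matches.
image = [[200, 200, 40, 40],
         [200, 200, 40, 40],
         [40, 40, 200, 200],
         [40, 40, 200, 200]]
image[0][0] = 100
print(scan(image))  # True: one bit away from the listed hash
```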
They also highlight recent research showing that such algorithms can be repurposed to add hidden secondary capabilities (such as facial recognition) and misused to power covert surveillance.
The academics are also concerned the bill will be used to push platforms to routinely run even more intrusive AI models that scan people’s messages for previously unseen but prohibited CSEA content. Such technology does not exist in a “sufficiently reliable” form, they warn — meaning if the bill forces such an implementation, the likely upshot will be masses of false positives wreaking widespread harm, as innocent messaging app users risk having their private messages widely viewed without cause and could even face being falsely accused of viewing CSEA.
“This lack of reliability here can have grave consequences as a false positive hit means potentially sharing private, intimate or sensitive messages or images with third parties, like private-company vetters, law enforcement and anyone with access to the monitoring infrastructure. This may in itself constitute exploitation and abuse of those whose messages are being disclosed,” the experts warn.
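The scale of that problem follows from simple base-rate arithmetic. Every number in the sketch below is an assumption chosen for illustration (the open letter cites no specific figures), but it shows how even a seemingly accurate classifier, applied to billions of messages a day in which prohibited content is vanishingly rare, exposes millions of innocent messages while almost every flag is wrong:

```python
# Illustrative base-rate arithmetic for message scanning at scale.
# Every number below is an assumption chosen for the example; the
# open letter cites no specific figures.

messages_per_day = 10_000_000_000  # assumed UK-wide daily message volume
prevalence = 1e-7                  # assumed fraction of messages that are prohibited
false_positive_rate = 0.001        # assumed 0.1% FPR; optimistic for novel-content AI

true_hits = messages_per_day * prevalence
false_hits = messages_per_day * (1 - prevalence) * false_positive_rate

# Precision: the chance that a flagged message is actually prohibited
# (Bayes' rule applied to the counts above).
precision = true_hits / (true_hits + false_hits)

print(f"Correct detections per day: {true_hits:,.0f}")   # 1,000
print(f"False positives per day: {false_hits:,.0f}")     # 9,999,999
print(f"Chance a flag is correct: {precision:.2%}")      # 0.01%
```

On those assumptions, roughly ten thousand innocent messages would be exposed for every genuine detection, which is the disproportion the researchers are pointing at.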
They also note that such “far-reaching” client-side scanning AI models would require a level of flexibility that would also make them easier to repurpose — “to expand their scope, by compromise or policy change” — raising the rights-chilling spectre of embedded CSEA scanning technologies being expanded to detect other types of content, and of UK citizens being subjected to steadily greater levels of state-mandated surveillance by default.
We’ve reached out to the Department for Science, Innovation and Technology seeking the government’s response to the open letter.