“This document is the most terrifying thing I’ve ever seen”, Matthew Green, a cryptography professor at Johns Hopkins, tweeted after a leaked draft of the legislation was shared. “It describes the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR. Not an exaggeration.”
The regulation would require scanning not only for existing child sexual abuse material but also for new material and for grooming. It would give authorities the power to monitor conversations on some of the most popular platforms in the world: a provider served with a “detection order” would have to use artificial intelligence to scan pictures and text messages.
“Detection orders are limited in time, targeting a specific type of content on a specific service”, the European Commission says, adding that they will be issued by courts or independent national authorities.
“Detection technologies must only be used for the purpose of detecting child sexual abuse. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.”
It also states that app stores must ensure children cannot download apps that may expose them to a high risk of solicitation.
Experts warned that introducing these powers in Europe would make them available to other governments. “By legally mandating the construction of these surveillance systems in Europe, the European government will ultimately make these capabilities available to every government”, Professor Green wrote.
Monitoring private conversations is “intrusive” but this is an acceptable balance because the algorithms do not “understand” the conversations.
Am I losing my mind, did actual thinking human beings write this text? pic.twitter.com/nON5xzPJM7
— Matthew Green (@matthew_d_green) May 11, 2022
Other privacy experts have echoed this criticism. The proposal is “incompatible with end-to-end encryption and with basic privacy rights,” Joe Mullin, senior policy analyst at the digital rights group Electronic Frontier Foundation, told CNBC.
“There’s no way to do what the EU proposal seeks to do, other than for governments to read and scan user messages on a massive scale,” Mullin continued. “If it becomes law, the proposal would be a disaster for user privacy not just in the EU but throughout the world.”
Many governments – including the UK, the US, and others across Europe – have attempted to erode user privacy by asking technology giants to put backdoors into end-to-end encrypted chats, a move that would make those chats more vulnerable to criminals as well.
While the plan does not outright call for the removal of end-to-end encryption, it is unclear how it could be carried out without undermining it.
“Criminals are already using distribution channels that would not be affected by these scans and will easily escape scans in the future,” Linus Neumann of the German hacker collective Chaos Computer Club said.
WhatsApp head Will Cathcart also criticised the bill in a Twitter thread. “Incredibly disappointing to see a proposed EU regulation on the internet fail to protect end-to-end encryption”, he wrote. “If the EU mandates a scanning system like this be built for one purpose in the EU, it will be used to undermine human rights in many different ways globally.”
He continued: “Legislators need to work with experts who understand internet security so they don't harm everyone, and focus on ways we can protect children while encouraging privacy on the internet.”
As is, this proposal would force companies to scan every person's messages and put EU citizens' privacy and security at serious risk. This is wrong and inconsistent with the EU's commitment to protecting privacy.
— Will Cathcart (@wcathcart) May 11, 2022
Apple had previously attempted to introduce anti-child-abuse features that used similar technology. The tools would have attempted to detect when children were being sent inappropriate photos, and when people had child sexual abuse material on their devices.
However, critics said that the tools could be repurposed to scan for other kinds of material, and that they undermined Apple’s public commitment to privacy as a human right. In September, Apple said that it would delay the features indefinitely while it improved them.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features”, the iPhone giant said in a statement.
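Apple’s shelved system worked by matching fingerprints of images against a database of known abuse material. A minimal sketch of that kind of hash-based matching is below; the names are hypothetical, and it is simplified to exact cryptographic hashes rather than Apple’s perceptual “NeuralHash”, which matches visually similar images rather than identical bytes:

```python
import hashlib

# Hypothetical database of fingerprints of known prohibited images.
# In a real deployment this list is supplied by an external authority.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint appears in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

The sketch also illustrates the critics’ core objection: the scanner can only flag what the database contains, so whoever curates that database decides what gets detected, and the same machinery could silently be pointed at other kinds of material.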