The European Commission on Wednesday proposed a new regulation that would require tech companies to scan for child sexual abuse material (CSAM) and grooming behavior, raising concerns that the mandate could undermine end-to-end encryption (E2EE).
To that end, online service providers, including hosting services and communication apps, are expected to proactively scan their platforms for CSAM as well as report, remove and disable access to such illicit content.
While instant messaging services like WhatsApp already rely on hashed versions of known CSAM to automatically block new uploads of images or videos matching them, the new plan requires such platforms to identify and flag new instances of CSAM.
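The hash-matching approach mentioned above can be illustrated with a minimal sketch. This is not how production systems work: services typically rely on perceptual hashes such as Microsoft's PhotoDNA, which tolerate resizing and re-encoding, whereas the plain SHA-256 digest used here only matches byte-identical copies. The blocklist digest is hypothetical.

```python
import hashlib

# Hypothetical blocklist of digests of known flagged images.
# (Real systems use perceptual hashes like PhotoDNA that survive
# resizing and re-encoding; a cryptographic hash such as SHA-256
# only matches exact byte-for-byte copies.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def should_block(upload_bytes: bytes) -> bool:
    """Return True if the upload's digest matches a known-bad hash."""
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

Because an exact-match scheme like this cannot catch modified or previously unseen material, detecting "new instances" of CSAM, as the proposal demands, would require classifiers or other content-analysis techniques with inherently higher false-positive rates.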
“Detection technologies must only be used for the purpose of detecting child sexual abuse,” the regulator said. “Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.”
A new EU Centre on Child Sexual Abuse, which will be independently established to enforce the measures, has been tasked with maintaining a database of digital “indicators” of child sexual abuse, in addition to processing and forwarding legitimate reports for law enforcement action.
In addition, the rules require app stores to ensure that children are prevented from downloading apps that "may expose them to a high risk of solicitation of children."
The controversial proposal to clamp down on sexual abuse material comes after a draft version of the regulation leaked earlier this week, prompting Johns Hopkins University security researcher Matthew Green to state that "This is Apple all over again."
The tech giant, which last year announced plans to scan and detect CSAM on its devices, has since delayed the rollout to “take additional time over the coming months to collect input and make improvements.”
Meta, likewise, has postponed its plans to support E2EE across all its messaging services, WhatsApp, Messenger, and Instagram, until sometime in 2023, stating that it’s taking the time to “get this right.”
A primary privacy and security concern arising out of scanning devices for illegal pictures of sexual abuse is that the technology could weaken privacy by creating backdoors to defeat E2EE protections and facilitate large-scale surveillance.
This would also necessitate persistent plaintext access to users' private messages, effectively rendering such scanning incompatible with E2EE and eroding the security and confidentiality of the communications.
“The idea that all the hundreds of millions of people in the E.U. would have their intimate private communications, where they have a reasonable expectation that that is private, to instead be kind of indiscriminately and generally scanned 24/7 is unprecedented,” Ella Jakubowska, a policy advisor at European Digital Rights (EDRi), told Politico.
But the privacy afforded by encryption is also proving to be a double-edged sword, with governments increasingly fighting back over worries that encrypted platforms are being misused by malicious actors for terrorism, cybercrime, and child abuse.
"Encryption is an important tool for the protection of cybersecurity and confidentiality of communications," the commission said. "At the same time, its use as a secure channel could be abused by criminals to hide their actions, thereby impeding efforts to bring perpetrators of child sexual abuse to justice."
The development underscores Big Tech's ongoing struggle to balance privacy and security while simultaneously addressing the need to assist law enforcement agencies in their quest to access criminal data.
“The new proposal is over-broad, not proportionate, and hurts everyone’s privacy and safety,” the Electronic Frontier Foundation (EFF) said. “The scanning requirements are subject to safeguards, but they aren’t strong enough to prevent the privacy-intrusive actions that platforms will be required to undertake.”