Meredith Whittaker, the President of the Signal Foundation, delivered the closing keynote speech at EDRi’s 20th-anniversary celebration in March 2023. Whittaker focused on the “recent spate of regulatory proposals and misguided tech fixes [like the EU’s Child Sexual Abuse Regulation] that offer false and surveillant solutions to complex social problems – solutions that always seem to lump the right to privacy in with malfeasance, and offer to address bad actions by eliminating privacy.”
By EDRi · April 13, 2023
Technosolutionism won’t fix societal issues
The European Commission’s proposal ‘laying down rules to prevent and combat child sexual abuse’ (CSAR) intends to protect children. However, this law would allow authorities to monitor anyone’s legitimate conversations. In doing so, it would harm everyone, including those it wants to protect. The EDRi network has been actively pushing European lawmakers to reject this law, which tries to resolve an inherently societal problem with evidently faulty technology, and to replace it with a rights-respecting, effective alternative proposal.
During EDRi’s 20th anniversary celebration, Signal’s Meredith Whittaker voiced the concerns that digital rights organisations like EDRi and our partners have been raising around the importance of ensuring the protection of our private, secure communications through encryption. She pointed out that “the measures being considered in Europe, the UK, and beyond [like the Child Sexual Abuse Regulation] – however noble the language justifying them appears – will pave the way for dark futures.”
Whittaker told the story of Jessica Burgess, a 41-year-old mother from Nebraska. In 2022, Facebook handed messages between Jessica and her daughter to law enforcement. These messages were then used to charge her with a felony for helping her daughter get reproductive healthcare in a state where such care had been made suddenly illegal.
Through that example, Whittaker explained that “complex social problems absolutely need serious redress. But using these problems as emotionally evocative pretexts to justify the elimination of privacy will not solve them.”
In democratic societies, trust in communication systems is vital for our lives and connections with others. This allows us to work, socialise, organise, express ourselves, and care for each other safely, without being put under arbitrary suspicion. Our privacy and security must be protected online, as they are offline, to ensure that our most sensitive conversations are not subject to unwarranted intrusion.
Why will the CSA Regulation never work technologically?
Whittaker explained how the CSA Regulation mandates practices that would be impossible to implement without weakening or eliminating end-to-end encryption, calling claims that this is feasible “magical thinking”.
The European Commission’s proposal fails to provide any evidence for the claims it is making about how technology works. Cybersecurity experts and technologists all around the world have warned that you cannot safely or effectively circumvent encryption in this way. Such interference with the encryption process will also make everyone’s phones vulnerable to attacks from malicious actors.
“Like the CSAR legislation, many claim without evidence that what they propose is compatible with end-to-end encryption, even as they mandate practices that would be impossible to implement without weakening or eliminating end-to-end encryption. This is like a boss giving an employee two days’ worth of work to complete and saying ‘I would never force you to work for two days straight, I am just telling you to complete this all in one day.’ It’s not possible, and saying it is doesn’t change that.”
Technology experts, including EDRi, have repeatedly shared their concern that such detection measures will inevitably lead to dangerous and unreliable Client-Side Scanning practices, undermining the essence of end-to-end encryption.
“Client-side scanning is a Faustian bargain that nullifies the entire premise of end-to-end encryption by mandating deeply insecure technology that would enable the government to literally check every utterance before it is expressed.”
Despite civil society’s efforts to talk to the European Commission and point out the dangers the CSA Regulation would create, the Commissioner in charge of the law, Home Affairs Commissioner Ylva Johansson, refused to meet with independent tech experts and civil society representatives. Instead, she took numerous meetings with the private sector and celebrities.
As Whittaker suggests in her speech, it’s absurd to hear “celebrities and influencers, not to mention politicians, claim that technological solutions exist that can scan content for forbidden expression without breaking end-to-end encryption.”
“I’m not a celebrity or an influencer, but I do know tech, and I will state for the record that there is no such thing. It’s simply not possible. And either these people are badly misinformed, in a deep and concerning state of denial, or dangerously cynical – hoping that by promising a nonsense tech solution they will get laws passed and implement surveillance before anyone is the wiser.”
What’s coming next?
With the legislative process well underway, we now see Members of the European Parliament (MEPs) wrestle with the proposal. Several Parliamentary committees have already issued their draft opinions, with the lead committee – the Civil Liberties Committee – poised to present their first approach later this month. However, independent research conducted on behalf of the Civil Liberties Committee has dealt a damning blow to the Commission’s proposal, warning of serious human rights violations. It remains to be seen how MEPs will navigate the proposal’s magical thinking.
In the meantime, EDRi has mobilised the Stop Scanning Me movement, which includes over 130 NGOs and 10 000 individual supporters, to ensure that EU lawmakers protect everyone on the internet, including children, by rejecting the CSA Regulation.