Logos of social media apps that need encryption for privacy. (Photo illustration by Chesnot/Getty Images)


Encryption and child sexual abuse material: Brussels struggling to digest ECHR ruling

"It is clear that the EU should, of course, cease its efforts towards the statutory obligation to decrypt end-to-end communications," Mikuláš Peksa, Czech MEP for the Pirate Party, tells Brussels Signal


The European Commission is struggling to reconcile a landmark European Court of Human Rights ruling on encryption with its own parallel efforts against child sexual abuse material.

In its February 13 ruling on encryption, the European Court of Human Rights (ECHR) held that laws requiring the weakening of encrypted communications violate the right to privacy.

A proposed EU law, meanwhile, would require messaging providers to scan all messages on their services for possible child sexual abuse material (CSAM).

This would require either technological backdoors or the loss of end-to-end encryption.

In a reaction to Brussels Signal, a Commission spokesperson was quick to emphasise “the Commission does not comment on specific judgments.”

It is for the countries concerned “to ensure compliance with the Court’s judgments,” the spokesperson added.

“The Commission 2022 proposal for a Regulation to prevent and combat child sexual abuse is technology-neutral. The proposal neither incentivises nor discourages the use of end-to-end encryption (E2EE),” said the spokesperson.

The EU’s proposals aim to fight child sexual abuse online by imposing extensive new obligations on online companies to “detect, report, block and remove” CSAM from their platforms.

Some MEPs argue that the EU now needs to revisit these proposals in light of the ECHR judgment.

“It is clear that the EU should, of course, cease its efforts towards the statutory obligation to decrypt end-to-end communications,” Mikuláš Peksa, Czech MEP for the Pirate Party, tells Brussels Signal.

This requirement has been “deemed disproportionate, highlighting the need for a more balanced approach to digital privacy and security,” Peksa says.

Nonetheless, says Peksa, “it appears that some parts of the EU officials are intensifying their efforts to access data.”

This is “evident from recent developments, such as the establishment of the High-Level Group (HLG) on access to data for effective law enforcement.”

According to Peksa, moves like this show “the drive for security measures can overshadow the fundamental privacy rights”.

He applauds the ECHR verdict, saying it “indeed marks a significant moment in the ongoing debate about privacy and security in the digital age.”

The Commission spokesperson argued that methods are available to identify images of child sexual abuse on end-to-end encrypted (E2EE) services.

“Furthermore, the proposed Regulation requires to consider the availability of sufficiently reliable detection technologies prior to issuing a detection order. This also applies to providers offering E2EE services,” the spokesperson added.

The Commission also points to the part of the regulation safeguarding “the protection of privacy of users and security of communications”.

The Council of the EU and the European Parliament are now negotiating the proposed regulation.

The aim, says the spokesperson, is to ensure there is no interruption and no legal gap in the voluntary detection of child sexual abuse online.

However, Peksa argues, “it is essential to emphasise that measures like CSAM scanning do not effectively protect children and are inefficient.

“Real protection comes from addressing the root causes and investing in preventive and supportive measures for at-risk individuals,” Peksa claims.

The approach to combating such serious issues must be balanced against privacy rights and weighed for the effectiveness of the measures implemented, says the MEP.

“In light of the ECHR’s verdict, it would be prudent for the EU to reconsider its stance on the CSAM-scanning proposal, taking into account the court’s emphasis on the disproportionate nature of decrypting end-to-end communications,” Peksa said.