European Union lawmakers have moved to block untargeted mass scanning of private communications, a practice that would have allowed authorities to screen messages of individuals not suspected of any wrongdoing.
The European Parliament voted to limit child sexual abuse material (CSAM) detection to cases that are proportional and targeted.
Voluntary scanning technologies are now restricted to material already identified as CSAM or content flagged by users, trusted flaggers, or relevant organisations.
Without further agreement, US tech giants such as Meta, Google, and Microsoft will have to stop indiscriminately scanning the private chats and photos of European citizens from April 4.
Any detection of CSAM must be both “proportional and targeted”, and end-to-end encrypted services such as WhatsApp, Signal and Telegram would fall outside the scope.
German digital rights activist and former MEP Patrick Breyer welcomed the result of the vote: “The mass surveillance of our chats on US platforms has never made a significant contribution to rescuing abused children.
“Instead, it has criminalised thousands of teenagers and severely overloaded our police forces,” he said.
“If investigators are no longer drowning in a flood of false suspicion reports, resources will finally be freed up to hunt down organised abuse rings in a targeted and covert manner. That is what truly protects children,” he added.
Tensions remain within the European Parliament itself, however, as Rapporteur Birgit Sippel has signalled support for broader, untargeted scanning of known material.
Earlier this week, trialogue negotiations on chat control between the European Parliament, the European Council and the European Commission collapsed, preventing any renewal of the exceptional regime that enables communications platforms to detect CSAM.
Sippel criticised member states for what she called a “lack of flexibility”, warning that the failure to reach agreement would have immediate consequences.
“It is unfortunate that Parliament and Council could not reach an agreement,” she said.
“Member states have deliberately accepted that the interim regulation will expire in April. From then on, voluntary scanning to counter the dissemination of child sexual abuse material online by the providers will no longer be possible,” she insisted.
As a result, voluntary scanning by online platforms will lose its legal basis under EU privacy rules.