Children need more "protection" online, an EP committee voted. (Photo: Jelleke Vanooteghem via Unsplash)


EP calls for tightened ‘safety’ measures for minors online


The European Parliament’s Internal Market and Consumer Protection Committee has approved a draft report on making online services safer for minors, in line with the controversial Digital Services Act (DSA). 

This morning, the committee backed the report calling for urgent action to strengthen protections for minors online. The report warned that current enforcement of the DSA is failing to shield children from addiction, mental health harm and exposure to illegal content.

Adopted with 32 votes in favour, five against and nine abstentions, the report urges the European Commission to accelerate enforcement of the DSA and close “legal gaps” that leave minors vulnerable to manipulative online practices.

It proposes a European Union-wide digital minimum age of 16 for access to social media, video-sharing platforms and AI “companions” unless parents provide consent, alongside a minimum age of 13 for any social media use.

With almost all young Europeans using the internet daily, studies suggest that one in four exhibit problematic smartphone use linked to addictive design features.

MEPs highlighted “systemic failures” by major platforms to comply with DSA obligations, particularly in protecting children from addictive algorithms, engagement-based recommendations and deceptive “dark patterns” that manipulate users into sharing data or making unintended purchases.

The report demands a ban on the most harmful practices, including profiling-based recommendations for minors, gambling-like mechanics in games and the monetisation of “kidfluencers” — children acting as social media promoters.

It also calls for stricter rules on AI-powered risks, such as deepfake technologies and chatbots that may exploit children’s vulnerabilities or distort their behaviour.

While supporting the EC’s push for privacy-preserving age verification systems, MEPs warned that such measures must not undermine children’s rights or become surveillance tools.

They stressed that platforms remain primarily responsible for ensuring their services are safe by design, regardless of age checks.

The report further criticises what it calls the ineffectiveness of current parental controls, which are often difficult to use and easily bypassed by minors.

It calls for EU-wide media literacy programmes to empower children, parents, and educators. It notes that a new Eurobarometer survey found young people are increasingly reliant on digital sources, heightening exposure to disinformation and commercial exploitation.

“Our report clearly states the need for increased protection of minors online in two respects. Firstly, we need a higher bar for access to social media, which is why we propose an EU-wide minimum age of 16. Secondly, we need stronger safeguards for minors using online services,” said Rapporteur Christel Schaldemose, an MEP with the Socialists and Democrats Group (S&D).

“My report calls for mandatory safety-by-design and for a ban on the most harmful engagement mechanisms for minors. I’m proud that Parliament is taking this progressive step to raise the level of protection for minors.”

The full EP will vote on the recommendations during the November 24–27 plenary session.

If adopted, the measures will increase pressure on the European Commission to strengthen DSA enforcement and propose new legislation, such as the forthcoming Digital Fairness Act, to address gaps in consumer protection.

Just like the DSA, the report calls on online platforms to adopt “safety by design” principles, including aggressive content moderation to shield minors from “misogynistic, racist, or homophobic views,” as well as “extreme content” and “disinformation”.

It offers no clear definitions, though, for terms such as “extreme content” or “disinformation”, leaving platforms and regulators with broad discretion to remove or restrict content.

It has been suggested that this ambiguity could lead to censorship, over-zealous moderation and politically motivated interference, affecting not only young people but, in time, adult users across the EU.

Without precise legal definitions, critics say, the measures risk handing unaccountable power to platforms and bureaucrats to decide what people can see online.

It also calls for “rapid alert mechanisms” to flag and remove “dangerous trends”, including disinformation campaigns.

Some fear this could empower authorities to target controversial but lawful opinions, particularly on issues like migration, gender, or public health.

The DSA has been opposed by free-speech advocates and the US Government.