

European Commission publishes ‘disinformation’ guidelines for Big Tech

There are "systemic risks online that may impact the integrity of elections", the European Commission has warned


With European elections only two months away, and increasing worries about cyber threats, the European Commission has published recommended anti-disinformation measures for social media and search engines.

There are “systemic risks online that may impact the integrity of elections”, warns the Commission.

The Commission says it wants to “guide” Very Large Online Platforms and Search Engines through the June elections.

Big tech companies that the Commission finds in breach of the requirements risk considerable fines of up to six per cent of their global annual turnover under the Digital Services Act (DSA).

As part of the EU-wide regulation, designated platforms with over 45 million active users in the EU are required to address online dangers associated with the bloc’s electoral processes, as well as uphold fundamental rights, such as freedom of expression.

As a result, platforms including Facebook, Instagram, Google Search, YouTube, LinkedIn, TikTok, and Elon Musk’s X all have a legal duty in the EU to obey such rules or risk sanction.

This can mean large platforms face difficult judgment calls between protected political content, such as political satire protected by free speech, and “harmful political disinformation”, which is “crafted with the intent to sway voters and manipulate elections”.

If content constitutes disinformation or an attempt at election interference, it falls under the DSA, and platforms face corresponding duties under EU law.

In all cases, platforms need to “implement elections-specific risk mitigation measures tailored to each individual electoral period and local context”.

The “mitigation measures” can include promoting official information on electoral processes, media literacy initiatives, and adapting their algorithms to stop content that threatens the integrity of electoral processes from going viral.

Platforms also need to identify advertisements as such, and give users a bigger say over algorithms, to reduce “the monetisation and virality of content that threatens the integrity of electoral processes.”

Generative AI receives particular attention: platforms must clearly label AI-generated content so that deepfake videos do not mislead viewers.

The Commission sets out its complete recommendations and best practices in an annex.

Tech companies will need to “cooperate with EU level and national authorities, independent experts, and civil society organisations”, it says.

Identified disinformation and foreign information manipulation and interference (FIMI) will need to carry fact-checking labels “provided by independent fact-checkers and fact-checking teams of independent media organisations.”

The Commission said its guidelines were its attempt to encourage effective information sharing before, during, and after the June European election, making it easier for platforms to implement suitable countermeasures around cybersecurity, disinformation, and FIMI.

Some subjects, however, may be tricky for platforms to address.

Examples here include “forms of racism, or gendered disinformation and gender-based violence”.

Platforms also could face difficult judgment calls with “public incitement to violence and hatred to the extent that such illegal content may inhibit or silence voices in the democratic debate, in particular those representing vulnerable groups or minorities”.

Tech companies will need to adopt specific measures, such as an incident response mechanism during the election period, to reduce the impact of incidents like these on electoral outcomes.

The measures must be “reasonable, proportionate, and effective,” and respect relevant prior regulations, says the Commission.

Technically, these recommendations for election security are still in draft form.

The Commission says it expects them to be formally adopted in April.