French President Emmanuel Macron has lashed out at several big tech firms for failing to properly censor the internet. (EPA-EFE/MOHAMMED BADRA / POOL)


Macron attacks big tech for not attending his moderation event

Google, Meta, and Elon Musk's X did not turn up for an event on online moderation the Élysée held last week


French President Emmanuel Macron says Google, Meta, and Elon Musk’s X have failed to properly moderate their online content.

None of the three companies bothered turning up to an event on online moderation the Élysée held last week.

“When companies that were committed” to coming, then “organise themselves not to be there and do not do what they said, trust is reduced,” Macron told reporters.

He is particularly frustrated with X’s alleged failure to take European content moderation demands seriously, claiming the company has only a few dozen French-speaking moderators on staff.

“When I am told that there are 52 moderators for X in French, I have difficulty considering that the content is moderated,” he adds.

Noting there are around “330 million French language speakers” worldwide, “these 52 people are either geniuses or have a lot of work”, says Macron.

Emphasising the need to “fight against terrorist and violent extremist content online”, Macron warns the firms’ apparent lack of respect will have consequences.

“The second people are no longer serious, we become less cooperative,” he says.

Macron was seemingly of the opinion that Chinese social media platform TikTok is taking online moderation, which critics view as censorship, more seriously than many of its Western counterparts.

TikTok “already has 687” French-language moderators working for them, according to Macron.

By revealing TikTok’s number of French-language moderators, he hopes to encourage other companies to take on more staff to moderate posts, he says.

EU officials are also continuing to demand social media firms crack down on purported “illegal content” and “disinformation” under the Digital Services Act (DSA).

The European Commission has most recently sent requests for information to Meta and Snapchat owner Snap Inc. over possible breaches of the DSA, though these relate to minors’ use of their platforms.

Both companies now have until 1 December to hand over information relating to the “risk assessments and mitigation measures” they implemented “to protect minors online”.

Commission officials are particularly concerned with the protections in place to mitigate “risks to mental health and physical health” the social media platforms pose to minors, they say.

Meta is already under preliminary investigation over insufficiently moderating illegal content available on its platform within the European Union, especially in relation to content depicting Hamas terrorism.