French President Emmanuel Macron says Google, Meta, and Elon Musk’s X have failed to properly moderate their online content.
None of the three companies bothered turning up to an event on online moderation the Élysée held last week.
“When companies that were committed” to coming then “organise themselves not to be there and do not do what they said, trust is reduced,” Macron told reporters.
He is particularly frustrated with X’s alleged failure to take European content moderation demands seriously, claiming the company only has a few dozen French-speaking moderators on staff.
“When I am told that there are 52 moderators for X in French, I have difficulty considering that the content is moderated,” he adds.
Noting there are around “330 million French-language speakers” worldwide, “these 52 people are either geniuses or have a lot of work,” says Macron.
Emphasising the need to “fight against terrorist and violent extremist content online”, Macron warns the firms’ apparent lack of respect will have consequences.
“The second people are no longer serious, we become less cooperative,” he says.
The European Commission has signed an agreement with France and Ireland aimed at getting the two countries to help enforce the bloc’s online censorship rules. https://t.co/KU75AjWEXE
— Brussels Signal (@brusselssignal) October 23, 2023
Macron was seemingly of the opinion that Chinese social media platform TikTok is taking online moderation, which critics view as censorship, more seriously than many of its Western counterparts.
TikTok “already has 687” French-language moderators working for it, according to Macron.
By revealing TikTok’s moderator numbers, he says, he hopes to encourage other companies to take on more staff to moderate posts.
EU officials are also continuing to demand social media firms crack down on purported “illegal content” and “disinformation” under the Digital Services Act (DSA).
The European Commission has most recently sent requests for information to Meta and Snapchat owner Snap Inc over possible breaches of the DSA, though these related to minors’ use of their platforms.
Both companies now have until 1 December to hand over information relating to the “risk assessments and mitigation measures” they implemented “to protect minors online”.
Commission officials are particularly concerned with the protections in place to mitigate “risks to mental health and physical health” the social media platforms pose to minors, they say.
Meta is already under preliminary investigation over allegedly insufficient moderation of illegal content on its platform within the European Union, especially in relation to content depicting Hamas terrorism.
Czech MEP Mikuláš Peksa has lambasted the European Union’s attempts at online censorship. @vonpecka | @EuropeanPirates https://t.co/NcNGQbj3NA
— Brussels Signal (@brusselssignal) September 26, 2023