The European Commission on October 2 requested information from YouTube, Snapchat and TikTok on the parameters their algorithms use to recommend content to users, and on the platforms' role in amplifying so-called “systemic risks”.
The commission asked the platforms to provide detailed information about their role in amplifying certain systemic risks, including those related to the electoral process and civic discourse, users’ mental well-being and the protection of minors.
The requests, made under the Digital Services Act (DSA), “also concern the platforms’ measures to mitigate the potential influence of their recommender systems on the spread of illegal content, such as promoting illegal drugs and hate speech,” the commission said in a statement.
Additional information was requested from TikTok about measures the firm had adopted to keep bad actors from manipulating the application and to reduce risks related to elections and civic discourse.
The tech firms must provide the requested information by November 15, after which the commission will determine next steps, which could include fines.
The EU has previously opened non-compliance proceedings under the DSA, which requires Big Tech companies to do more to tackle illegal and harmful content on their platforms, over the recommender systems of Meta’s Facebook and Instagram, AliExpress and TikTok.