TikTok is pushing back against a series of recent reports accusing the platform of favouring right-wing political content, calling the studies “methodologically flawed”. (Photo illustration by Ezra Acayan/Getty Images)


Flawed methods or right-wing tilt? TikTok’s ‘political influence’ under NGO scrutiny


NGO Global Witness has alleged that TikTok’s algorithm disproportionately promoted “far-right political content”.

The platform denied the allegations, citing what it called serious methodological flaws in the research, sparking a deeper debate over transparency and political influence on social media platforms.

TikTok has pushed back against recent reports accusing the platform of favouring right-wing political content, calling the studies “methodologically flawed”.

It questioned the NGO’s methodology, arguing that the test was deeply unscientific and reflected a complete lack of understanding of how recommendation systems worked more broadly.

“The experiment draws conclusions about our overall recommendation algorithm based on the anecdotal experience of just three accounts in a one-time test with no control group or wider sampling,” it said.

“Global Witness’s actions would likely have manipulated the outcome of the experiment by impacting the signals that shape how we recommend content,” it added.

The criticism came after Global Witness published a series of research reports suggesting TikTok’s algorithm amplified nationalist or far-right narratives ahead of elections in countries including Poland and Germany.

The allegations came at a time when social media platforms have been under global pressure to safeguard democratic processes and ensure fair political representation online.

“Any suggestion that TikTok favours any political party is categorically false. As we shared with Global Witness, this ‘study’ uses flawed methodology that led to incorrect conclusions that do not accurately reflect how our recommendation system actually works,” a TikTok spokesperson told Brussels Signal. 

While the studies have been widely cited in media and activist circles, TikTok argued they failed to meet basic standards of scientific rigour.

“The investigation conducted by Global Witness regarding Polish elections relies on the same flawed methodology as previous reports covering other elections,” it said.

According to TikTok, the studies, including recent ones conducted in Romania, Poland and Germany between December 2024 and June 2025, suffered from major methodological flaws.

Each relied on a small sample size – between three and nine user accounts – making the findings statistically weak, it said. The observation periods were also notably brief, lasting only 10 to 20 minutes per session, which was insufficient to capture how TikTok’s algorithm adjusted and evolved user feeds over time, the company claimed.

Additionally, none of the studies included reproducibility testing, meaning their results were not verified through repeated trials.

Despite the criticism, Global Witness stood by its research. Speaking to Brussels Signal on June 9, the organisation said: “TikTok has repeatedly failed to explain how its algorithm prioritises content during elections.

“Until TikTok improves transparency for journalists and researchers, we believe that our methodology represents the best available attempt to understand how the platform recommends political content to undecided voters ahead of elections,” it said.

In its study published on May 29, the international NGO stated: “TikTok’s algorithm is feeding new, politically balanced users twice as much far-right and nationalist right content as centrist and left-wing content.”

It argued that right-wing content tended to attract more user engagement through likes, shares and comments, due to its often provocative or polarising nature.

Global Witness implied that the algorithm’s behaviour did not reflect the actual popularity or electoral performance of the candidates, suggesting a possible bias in how political content was distributed on the platform.

In the context of the Polish election, Global Witness highlighted how TikTok’s content delivery may have influenced voter perceptions.

The NGO based that claim on the fact that the platform “showed our test accounts five times more content supporting nationalist right candidate Karol Nawrocki than centrist candidate Rafal Trzaskowski, despite the fact that at the time of testing, the centrist candidate’s official TikTok account was more popular than his opponent’s, with 12,000 more followers and nearly 1 million more likes”.

“Our results suggest TikTok has not taken sufficient action to prevent its platform prioritising right-wing content and again risks undermining the integrity of a national election,” Global Witness wrote.

“For social media users, platforms are more often than not the primary way the public accesses information. This means that all content, including information related to elections, and broader social, cultural and political issues that inform how people vote, is filtered through these platforms’ recommendation system,” said a spokesperson from Global Witness’ “Digital Threats” campaign.

While the extent of social media’s influence on voting behaviour remained under debate, its role in political campaigning seemed clearer across recent European Union election cycles.

Global Witness maintained that TikTok’s algorithm design may inherently favour certain political perspectives, a concern echoed by others in the digital policy space.

On June 2, Gilles Babinet, co-president of France’s National Digital Council, warned that social media platforms “are incompatible with democracy,” citing how their algorithms tended to elevate extreme viewpoints and hindered constructive debate.

TikTok countered this narrative, arguing that it functioned more like a content-sharing platform, similar to YouTube, rather than a traditional social media site.