LinkedIn’s controversial decision to remove accurate Covid-19 content and suspend related user accounts was not based on the Digital Services Act (DSA) but on the platform’s own terms and conditions, according to the European Commission.
The statement contradicts earlier interpretations of a Berlin court ruling, which suggested the DSA provided legal cover for LinkedIn’s actions.
In the ruling, issued on September 18, the court repeatedly referred to the DSA as a justification for its decision.
Responding to Brussels Signal, a European Commission spokesperson emphasised that the DSA, which has applied to very large online platforms such as LinkedIn since August 2023, does not mandate the removal of specific content.
Instead, the law requires platforms to be transparent about their moderation decisions and provide users with the right to appeal.
“From publicly available information, the DSA is not legally applicable in this case,” the spokesperson stated.
“LinkedIn’s decision to remove content and suspend the account was based on its Terms and Conditions, not the DSA.”
The spokesperson further noted that less than 1 per cent of LinkedIn’s moderation actions stem from user reports under the DSA.
Dietrich Murswiek, the lawyer representing the censored plaintiff, told Brussels Signal: “The court based its ruling on LinkedIn’s terms and conditions but the question was if the relevant clause of these terms and conditions is in accordance with Article 14 DSA, which the court answered in the affirmative.”
He said the court’s decision was wrong and that under Article 14(4) of the DSA, platforms are explicitly required to respect freedom of expression when enforcing content restrictions.
Murswiek added that this obligation must be interpreted in line with Article 11 of the EU Charter of Fundamental Rights, which guarantees the freedom of expression and information as well as the freedom and pluralism of the media.
The overwhelming majority of LinkedIn’s moderation actions are driven by the platform’s internal policies, such as its Professional Community Policies, which explicitly prohibit content that contradicts guidance from major health authorities, including the World Health Organisation.
LinkedIn’s terms and conditions state users must not share content that “directly contradicts guidance from leading global health organisations and public health authorities”, including false claims about vaccine safety or efficacy.
The platform’s policies also claim to ban misleading content, including [AI-based] synthetic or manipulated media that depicts a person saying or doing something they did not say or do, without clear disclosure that the material is fake or altered. Content that might improperly influence an election or other civic process is also proscribed.
The Berlin Court of Appeal’s ruling upheld LinkedIn’s right to enforce these policies, even when the content in question was factually accurate.
Legal experts and free-speech advocates have raised concerns that the decision, combined with LinkedIn’s broad discretion, could set a dangerous precedent.
A constitutional complaint has been filed with Germany’s Federal Constitutional Court, which has historically protected free expression unless information is demonstrably false.
The case could determine whether private platforms can suppress dissenting views under the guise of “misinformation”, even when those views are supported by evidence.
Critics argue this creates a two-tier system of free speech: one for institutional narratives and another for dissenting voices – even when those voices are factually correct.