€120 Million Fine for X Sends a Political Signal – But Fails to Protect Democracy
Dear friends!
This is my message of the day: For the first time, the EU Commission has used the teeth of the Digital Services Act (DSA). I have been pushing for this for years!
You can find all the details in my press release below.
With determined regards,
Yours, Alexandra Geese
—
Press Release: €120 Million Fine for X Sends a Political Signal – But Fails to Protect Democracy
Brussels, 5 December 2025 – Today, the European Commission issued its first formal non-compliance decision under the Digital Services Act (DSA) against the platform X (formerly Twitter), imposing a fine of EUR 120 million. The Commission finds serious infringements: the misleading use of the “blue checkmark”, major shortcomings in advertising transparency, and the systematic obstruction of data access for researchers.
For Alexandra Geese, MEP for the Greens/EFA Group, this decision is a necessary first step – but politically incomplete:
“This is an important start, but not a breakthrough. As long as the Commission fails to rule on the algorithms, the central lever of manipulation remains untouched. Anyone who, like Elon Musk with X, systematically relies on deception, manipulation and opacity must not be allowed to treat this as a calculable cost of doing business.”
Already in January 2025, the Commission launched additional investigatory measures into X’s recommender systems, including requests for internal documents on algorithm changes and access to technical interfaces. These investigations concern the suspicion that the algorithm systematically amplifies certain political content while suppressing other content. A decision on these allegations is still pending.
Geese sharply criticises the absence of this second, central decision:
“The Commission is dodging the decisive question of power: the algorithms. The systemic risk for elections and democratic discourse under Article 34 DSA is being left out, even though analyses of elections in Germany, Poland and the United Kingdom clearly show that X manipulates political reach, distorts election campaigns and systematically boosts certain opinions while suppressing others.”
(Note: Studies demonstrating political bias on X are listed below.)
This regulatory restraint comes at a time of intense geopolitical pressure on European platform regulation. Geese also addresses this in clear terms:
“When the US Vice President attacks the EU, Elon Musk applauds him in public, and at the same time threats are made of visa harassment against fact-checkers and of using trade tariffs to blackmail Europe, this is no longer a debate about freedom of expression. This is coordinated geopolitical pressure. It is about controlling algorithms as a political instrument of power in order to manipulate elections and prepare regime change in Europe. The Commission is still ignoring exactly this systemic risk under Article 34 DSA. I expect it to adopt a separate, tough decision on this without delay.”
Deception through the blue checkmark: a systematic abuse of trust
The Commission finds that X misleads users about the alleged “verified status”: the blue checkmark suggests authenticity and trustworthiness, even though no genuine identity verification takes place. This deliberately facilitates fraud, identity theft and political manipulation. Geese explains:
“The blue checkmark is not a design flaw, it is a fraud feature. X sells credibility to anyone who pays, whether troll, scammer or propagandist. Trust is turned into a commodity and manipulation into normality. This opens the door wide to fraud, coordinated disinformation and election meddling.”
Advertising transparency effectively dismantled
Geese is particularly concerned that X’s advertising repository is practically unusable: it lacks information on sponsors, content and targeting, and erects deliberate access barriers for the public and for researchers:
“Without effective advertising transparency, there can be no democratic control of digital election interference. X is deliberately sabotaging scrutiny of political ads, disinformation campaigns and hybrid threats. This is not a technical failure, it is political irresponsibility.”
Blocking research access prevents exposure of systemic risks
The Commission also confirms that X systematically denies researchers the legally required access to public platform data, including through contractual clauses and technical barriers:
“Those who block research do not want transparency, they want ignorance. X is deliberately preventing the societal harm caused by its algorithms from being scientifically documented. This is a direct attack on evidence-based democracy policy.”
Now we find out whether the DSA has teeth
Geese is highly critical of the fact that X now has up to 90 working days to submit action plans. After years of systematic law-breaking, she believes there must be no more benefit of the doubt:
“X has not broken the law once, but for years. What we need now is not another action plan on paper, but immediate enforcement: daily penalty payments, tough deadlines and, if necessary, direct interventions in the recommender algorithms.”
In conclusion, Geese underlines the fundamental significance of today’s decision:
“The DSA was created for exactly these kinds of situations. Today, Europe has shown that it can act – but not yet that it can enforce. Our democracy is worth more than a symbolic fine for one of the most powerful men in the world.”
BACKGROUND
The European Commission opened formal proceedings against X under the Digital Services Act on 18 December 2023. On 12 July 2024, it issued preliminary findings concerning deceptive design, lack of advertising transparency and insufficient data access for research. On 17 January 2025, the Commission also launched additional investigatory measures into X’s algorithm changes, focusing in particular on potential distortions caused by recommender systems.
With today’s decision, the Commission has issued its first formal non-compliance decision under the DSA and imposed a fine of EUR 120 million. A decision on systemic risks under Article 34 DSA is still outstanding.
Studies demonstrating political bias on X:
- Sky News. The X effect: How Elon Musk is boosting the British Right. https://news.sky.com/story/the-x-effect-how-elon-musk-is-boosting-the-british-right-13464487
- Prama, T. T., Bagchi, C., Kalakonnavar, V., Krauß, P., & Grabowicz, P. A. (2025). Political biases on X before the 2025 German Federal Election. https://arxiv.org/abs/2503.02888
- Prama, T. T., Kalakonnavar, V., & Grabowicz, P. A. Algorithmic Biases on X before the 2025 Polish Presidential Election. https://zenodo.org/records/17512529
- Global Witness (2025). TikTok and X recommend pro-AfD content to non-partisan users ahead of the German elections. https://globalwitness.org/en/campaigns/digital-threats/tiktok-and-x-recommend-pro-afd-content-to-non-partisan-users-ahead-of-the-german-elections/
- Ye, Luceri & Ferrara (2025). Auditing Political Exposure Bias: Algorithmic Amplification on Twitter/X During the 2024 U.S. Presidential Election. In Proceedings of the 2025 ACM Conference on Fairness, Accountability, and Transparency (pp. 2349-2362). https://dl.acm.org/doi/full/10.1145/3715275.3732159
- Huszár et al. (2022). Algorithmic amplification of politics on Twitter. Proceedings of the National Academy of Sciences, 119(1). https://www.researchgate.net/publication/357230555_Algorithmic_amplification_of_politics_on_Twitter
- Verwiebe et al. (2025). Digitalisiert, politisiert, polarisiert? Eine Analyse von Social-Media-Feeds junger Menschen zur Bundestagswahl 2025 auf TikTok, YouTube, Instagram und X. Bertelsmann Stiftung. https://www.bertelsmann-stiftung.de/fileadmin/files/user_upload/Digitalisiert_politisiert_polarisiert.pdf (in German)