The European Commission has imposed its first penalty under the Digital Services Act, fining X €120 million for what regulators described as a deceptive blue check verification system and wider transparency failures. The decision sets a precedent under the European Union’s far-reaching platform law and will put pressure on X to overhaul how it labels accounts, discloses advertising, and grants researchers access to public data.
What prompted the EU’s action against X under the DSA
At the core of the case is X’s shift from an identity-verified badge to a paid subscription marker that looks just like the old “verified” check. The Commission found that presenting paid status with visual elements historically used to signal identity verification deceives users, a design practice the DSA treats as misleading by design. Regulators contend this blurring heightens the risk of scams, impersonation, and manipulation.
The decision goes further, finding that X’s ad repository lacks critical details needed for public scrutiny, such as consistent data on advertisers and ad content, and imposes unjustified delays and limits on access. The Commission also faulted X’s procedures for researcher access to public data, arguing that the company has erected unnecessary obstacles to independent examination of systemic risks, a core obligation for the largest platforms under the DSA.
Why the blue check redesign became a source of risk
For years, the blue check served as an identity verification signal for public figures, journalists, and officials, and became shorthand for legitimacy. When that signal began being sold as a subscription, the mark stayed the same but its meaning changed. This is precisely the kind of design decision the DSA targets: interfaces that nudge users into false assumptions of authenticity.
The risks are not theoretical. After paid badges were introduced, high-profile imposter accounts proliferated, including a fake account posing as a major drug company that announced “free insulin,” briefly wiping billions of dollars off the real company’s market value. European regulators, concerned about election integrity and financial fraud, point to such episodes as evidence that misleading verification cues can turbocharge real-world harm.
Transparency in advertising and researcher data access
The DSA requires very large online platforms to maintain timely, searchable ad repositories with rich metadata: who paid for an ad, who was targeted, what was shown, and why. The Commission found X’s repository lacking on several of these dimensions, with processing delays that undermine the public’s ability to “assess influence operations, electioneering messaging, or deceptive commercial practices.”
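To make that concrete, here is a minimal sketch in Python of what one ad-repository record could contain. The field names are illustrative assumptions that loosely track the DSA’s ad-transparency themes, not X’s actual schema or the regulation’s exact wording:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AdRecord:
    """Hypothetical ad-repository entry; fields loosely mirror DSA ad-transparency themes."""
    ad_id: str
    advertiser: str             # the person or entity presenting the ad
    payer: str                  # who paid for it, if different from the advertiser
    content: str                # what was shown (creative text or a media URL)
    first_shown: date
    last_shown: date
    targeting_parameters: dict  # why a user saw it, e.g. {"country": "DE", "age": "25-34"}
    total_recipients: int       # aggregate reach

def find_by_advertiser(repo: list[AdRecord], name: str) -> list[AdRecord]:
    """A repository only supports public scrutiny if records are searchable like this."""
    return [r for r in repo if r.advertiser == name]
```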
Researcher access is equally pivotal. The law requires that vetted researchers be given access to public data to study systemic risks such as disinformation, gender-based violence, and threats to public health. The Commission argues that X’s approval processes, rate limits, and contractual barriers together impede that access, in breach of the DSA’s transparency framework.
What the DSA demands of very large online platforms
The DSA sets a hard threshold for “very large” online platforms: those with more than 45 million monthly active users in the EU. Their obligations include avoiding misleading design, conducting annual risk assessments, adopting mitigations, and opening ad libraries and public data access to scrutiny as well as independent audits. Non-compliant firms can be fined up to 6% of global annual turnover and, if violations persist, face periodic penalty payments of up to 5% of average daily worldwide turnover until fixes are made.
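For a sense of scale, here is a back-of-the-envelope sketch of those ceilings in Python; the turnover figure is a pure assumption for illustration, not X’s actual revenue:

```python
# Hypothetical figures for illustration only.
annual_turnover = 3_000_000_000            # assumed global annual turnover
max_fine = 0.06 * annual_turnover          # DSA ceiling: 6% of global annual turnover
avg_daily_turnover = annual_turnover / 365
daily_penalty = 0.05 * avg_daily_turnover  # periodic penalty: up to 5% per day of non-compliance

print(f"Maximum one-off fine: {max_fine:,.0f}")           # 180,000,000
print(f"Periodic penalty per day: {daily_penalty:,.0f}")  # ~410,959
```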

This first fine signals that design decisions are fair game for enforcement, not just content takedowns or back-end policies. It also confirms that transparency is an operational requirement: ad libraries and data access must work well enough to withstand external audit in practice, not merely on paper.
What’s next for X after the EU’s first DSA enforcement
X has been directed to outline how it will address the blue check issues within 60 days and provide an action plan on ad transparency and researcher access within 90 days. If its remedial actions fall short, the Commission can escalate with additional sanctions or compliance orders. X can appeal the decision before the EU’s General Court, though appeals do not automatically suspend compliance obligations.
In practice, fixing the verification problem likely means clear, independent signals for identity and for subscription status, explicit labeling that breaks the link with the legacy badge, and robust identity checks wherever “verification” implies authenticity. On transparency, X would need to expose richer ad metadata, cut processing delays, and streamline researcher access without unreasonable obstruction.
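One way to picture that decoupling, as a sketch under assumed names rather than a description of X’s systems, is to model identity verification and subscription as independent attributes so the interface can never conflate them:

```python
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    identity_verified: bool  # established through ID and liveness checks
    is_subscriber: bool      # paid tier: perks, not a claim of authenticity

def badge_label(account: Account) -> str:
    """Render distinct, honest labels instead of one ambiguous check mark."""
    if account.identity_verified and account.is_subscriber:
        return "ID-verified · Subscriber"
    if account.identity_verified:
        return "ID-verified"
    if account.is_subscriber:
        return "Subscriber"
    return ""
```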
Here’s why the decision matters for platforms beyond X
The decision serves as a warning for all designated platforms: in the eyes of regulators, paid features that imitate trust signals, thin ad libraries, and restricted researcher access are liabilities.
Competitors that already separate paid perks from identity verification, including some that use government ID and liveness checks, may have models better aligned with the DSA’s expectations.
For policymakers and civil society, this case is a litmus test of whether the DSA’s promise of greater accountability in how platforms design and operate themselves actually materializes. For users, it could mean less confusion over badges, more trustworthy accounts, and clearer information about who is paying to reach them. And the lesson for platforms is clear: design and transparency are now mandatory compliance obligations, not optional best practices.