European regulators have made a preliminary finding that Meta and TikTok violated key transparency and user-protection requirements of the Digital Services Act, setting two of the world’s largest platforms on a collision course with the EU’s most stringent online safety law.
What the Commission Found in Its Preliminary DSA Probe
“The Commission concludes that the collaboration did not provide enough opportunities for researchers and civil society organizations (CSOs) to be exposed to public data — one of the key building blocks underpinning the goal of capturing systemic risks, such as spread of illegal content, harm to minors or interference in elections,” the European Commission said. Officials said the companies’ procedures for granting data access were onerous, leaving academics and civil society with only partial or unreliable datasets.

Regulators also singled out Meta’s Facebook and Instagram, saying their interfaces make it unreasonably hard for EU users to report illegal material. The Commission also cited “dark patterns,” or design choices aimed at nudging people away from completing reports, as well as appeal processes that don’t let users adequately explain or substantiate their case, denying them effective redress.
Researcher Access Is a Test of the DSA’s Core Promises
The DSA’s vetted-researcher regime, anchored in Article 40, was intended to end the era of social media opacity. It requires very large online platforms, those with more than 45 million users in the European Union, to share public data that is indispensable for studying systemic risks. Independent access matters especially in the wake of high-profile incidents where a lack of transparency hampered oversight, whether around surges in disinformation or failures to prevent child exploitation.
Academics have long cautioned against eroding windows into platform behavior. Meta’s decision to shut down CrowdTangle, a widely used tool for monitoring how content spreads, drew criticism from academics, fact-checkers and election monitors across Europe. The Commission has seized on this point, too, saying its own findings indicate that the companies’ existing APIs and portals do not provide timely access or anything like the complete, stable records needed for meaningful analysis.
Platform Responses and Legal Tensions Under the DSA
Both companies pushed back. TikTok said it had made “significant investments” in data access and works with nearly 1,000 research groups through its tools. It also warned that “relaxing data sharing safeguards can clash with privacy requirements under the GDPR,” and pressed regulators to clarify how companies should balance the two sets of obligations when they conflict.
Meta said it disagreed with the Commission’s assessment and pointed to changes it has made since the DSA entered into force, including updated reporting flows, an appeals process and tools to help researchers gain access. The company maintained that its products comply with EU law.

Dark Patterns and User Redress in the Spotlight
The DSA outlaws manipulative interface design and requires straightforward, effective reporting channels (Article 16) and internal complaint systems (Article 20). Design frictions on Meta’s services, such as extra steps, distracting prompts or confusing pathways, can discourage users from flagging illegal posts and hinder meaningful appeals, regulators say. If the findings stand, they would mark a move against an increasingly sophisticated array of under-the-radar UI tricks that influence user behavior without informed consent.
Consumer groups and digital rights organizations have been calling for exactly this kind of enforcement, arguing that interface design is integral to platform accountability. Making reporting and appeals intuitive isn’t a superficial tweak; it determines how quickly illegal content gets spotted and dealt with.
Next Steps and Potential Consequences
The findings are preliminary. Meta and TikTok can review the case files, dispute the evidence and offer commitments to address the problems. If the Commission ultimately finds violations, the companies face fines of as much as 6 percent of global annual revenue, along with binding orders to change their product design and data-access practices.
For TikTok, the investigation builds on broader scrutiny of advertising transparency, protection of minors and content moderation. For Meta, it intersects with ongoing questions about election integrity on Facebook and Instagram, where researcher visibility and rapid user reporting are seen as critical safeguards.
A Broader Enforcement Wave Is Building Across the EU
The EU has made clear that DSA enforcement is about to get tougher, with several proceedings aimed at curbing systemic risk, increasing recommender-system transparency and ending the use of dark patterns. The law complements the GDPR’s privacy regime and works in tandem with the Digital Markets Act, another sweeping law that takes aim at gatekeeper power, together forming a dense web of compliance obligations for global tech platforms operating in Europe.
However these cases are resolved, one thing is crystal clear: the era of voluntary transparency has passed. Platforms will be measured not by their policies alone, but by actual researcher access, low-friction reporting of illegal content and appeal mechanisms that work in practice. That bar is higher now, and falling short carries real financial and operational risk.