
EU Finds Meta And TikTok Breached DSA Transparency Rules

By Gregory Zuckerman
Technology
Last updated: October 26, 2025, 5:44 pm

European regulators have made a preliminary finding that Meta and TikTok violated key transparency and user-protection requirements of the Digital Services Act, setting two of the world’s largest platforms on a collision course with the EU’s most stringent online safety law.

What the Commission Found in Its Preliminary DSA Probe

“The Commission concludes that the collaboration did not provide enough opportunities for researchers and civil society organizations (CSOs) to be exposed to public data — one of the key building blocks underpinning the goal of capturing systemic risks, such as spread of illegal content, harm to minors or interference in elections,” the European Commission said. Officials complained that the companies’ data-access mechanisms and procedures were onerous, leaving academics and civil society with only partial or unreliable datasets.


Regulators also singled out Meta’s Facebook and Instagram, saying their interfaces make it unreasonably hard for EU users to report illegal material. The Commission also cited “dark patterns,” or design choices aimed at nudging people away from completing reports, along with appeal processes that do not let users sufficiently explain or evidence their case, denying them effective redress.

Researcher Access Is a Test of the DSA’s Core Promises

The DSA’s vetted-researcher program, anchored in Article 40, was intended to end the era of social media opacity. It requires very large online platforms, those with more than 45 million users in the European Union, to share public data that is indispensable for studying systemic risks. Independent access is especially important in the wake of high-profile incidents where a lack of transparency hampered oversight, whether around surges in disinformation or failures to prevent child exploitation.

Academics have long cautioned against eroding windows into platform behavior. Meta’s decision to shut down CrowdTangle, a tool widely used to monitor how content spreads and goes viral, drew criticism from academics, fact-checkers and election monitors across Europe. The Commission seized on this point as well, saying its own findings indicate that existing APIs and portals do not provide timely access or a complete, stable record suitable for meaningful analysis.

Platform Responses and Legal Tensions Under the DSA

Both companies pushed back. TikTok said it had made “significant investments” in data access and worked with nearly 1,000 research groups through its tools. It also warned that “relaxing data sharing safeguards can clash with privacy requirements under the GDPR,” and pressed regulators to clarify how companies should balance the two sets of obligations when they conflict.

Meta said it disagrees with the Commission’s assessment and cited changes it has made since the DSA entered into force, such as updated reporting flows, an appeals process and tools to help researchers gain access. The company argued that its products complied with EU law.


Dark Patterns And User Redress In Spotlight

The DSA outlaws manipulative interface design and requires straightforward, effective reporting channels (Article 16) and internal complaint-handling systems (Article 20). Design frictions on Meta’s services — such as extra steps, distracting prompts, or perplexing pathways — can discourage users from flagging illegal posts and hinder meaningful appeals, regulators say. If they stand, those findings would mark a move against an increasingly sophisticated array of under-the-radar UI tricks that influence user behavior without informed consent.

Consumer groups and digital rights organizations have been calling for exactly this kind of enforcement, arguing that interface design is integral to platform accountability. Making reporting and appeals intuitive is not a superficial tweak; it determines how quickly illegal content gets spotted and dealt with.

What Happens Next and the Potential Consequences

The findings are preliminary. Meta and TikTok can review the case files, dispute the evidence and offer commitments to address the problems. If the Commission ultimately confirms the violations, penalties could run as high as 6 percent of global annual revenue, and both companies could face binding orders to change product design and data-access practices.

For TikTok, the investigation builds on broader scrutiny of advertising transparency, protection of minors and content moderation. For Meta, it intersects with ongoing questions about election integrity on Facebook and Instagram, where researcher visibility and quick user reporting are seen as critical protections.

A Broader Enforcement Wave Is Building Across the EU

The EU has made clear that DSA enforcement is about to get tougher, with several proceedings aimed at reducing systemic risk, increasing recommender transparency and ending the use of dark patterns. The law complements the GDPR’s privacy regime and works in tandem with another sweeping law that takes aim at gatekeeper power, the Digital Markets Act, together forming a dense web of compliance obligations for global tech platforms operating in Europe.

However the cases are decided, one thing is clear: the era of voluntary transparency has passed. Platforms will be measured not by their policies alone, but by actual researcher access, reporting flows that work with minimal friction for users, and appeal mechanisms that function in practice. That bar is higher now, and failing to clear it carries real financial and operational risk.

By Gregory Zuckerman
Gregory Zuckerman is a veteran investigative journalist and financial writer with decades of experience covering global markets, investment strategies, and the business personalities shaping them. His writing blends deep reporting with narrative storytelling to uncover the hidden forces behind financial trends and innovations. Over the years, Gregory’s work has earned industry recognition for bringing clarity to complex financial topics, and he continues to focus on long-form journalism that explores hedge funds, private equity, and high-stakes investing.