The European Commission has delivered a sharp rebuke to TikTok, saying the platform’s core design fuels compulsive use and must change. In preliminary findings under the Digital Services Act, regulators called out infinite scroll, autoplay, push notifications, and the recommendation engine, and told the company to rework the app’s “basic design” to reduce risks to minors and vulnerable users.
The Commission argues TikTok failed to adequately evaluate how its features affect user well-being, saying that signals such as late-night watch time and frequent app opens were given too little weight in the company's own risk assessments. Brussels wants concrete friction added: disable infinite scroll, introduce regular screen-time breaks that are hard to dismiss, and modify the recommender so it no longer nudges users into endless viewing loops.
- What Brussels Found Under The DSA For TikTok’s Design
- The Features Under Fire: Scroll, Autoplay, Alerts
- TikTok Pushes Back Against EU’s Preliminary Findings
- Why The Recommendation Engine Matters For Safety
- A Global Shift On Youth Safety And App Design Risks
- What Comes Next In The EU’s Case Against TikTok

What Brussels Found Under The DSA For TikTok’s Design
As a designated Very Large Online Platform, TikTok faces strict DSA obligations to identify and mitigate systemic risks, including those related to minors’ mental health. The Commission’s view is that the app’s reward structure—constantly delivering fresh clips tailored to micro-signals—encourages continued scrolling rather than mindful stopping.
Regulators also criticized existing safeguards. Time-limit prompts are easy to dismiss, they said, and parental controls demand time and know-how many families lack. Effective mitigation, in the Commission’s framing, means adding real friction: forced breaks after defined intervals, default settings that curb autoplay and push alerts at night, and recommendation changes that reduce exposure to compulsive loops.
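To make the idea of "real friction" concrete, here is a minimal TypeScript sketch of a forced-break timer whose dismiss control only activates after a short delay. The BreakScreen interface, the SessionGuard class, and both thresholds are hypothetical illustrations, not TikTok's actual implementation.

```typescript
// Hypothetical sketch: a session timer that shows a break screen after a
// fixed interval, with the dismiss action delayed so the prompt cannot be
// reflexively swiped away. All names and thresholds are illustrative.

const BREAK_INTERVAL_MS = 60 * 60 * 1000; // force a break every 60 minutes
const DISMISS_DELAY_MS = 10 * 1000;       // dismiss button activates after 10 s

interface BreakScreen {
  show(): void;
  enableDismiss(): void;
}

class SessionGuard {
  private elapsedMs = 0;

  constructor(private breakScreen: BreakScreen) {}

  // Called by the app's frame loop or a periodic background tick.
  tick(deltaMs: number): void {
    this.elapsedMs += deltaMs;
    if (this.elapsedMs >= BREAK_INTERVAL_MS) {
      this.elapsedMs = 0;
      this.breakScreen.show();
      // The prompt stays up for a minimum time before it can be closed,
      // which is the kind of friction the Commission describes.
      setTimeout(() => this.breakScreen.enableDismiss(), DISMISS_DELAY_MS);
    }
  }
}
```

The key design choice is the delayed dismiss: a prompt that can be tapped away instantly is exactly the kind of safeguard regulators called ineffective.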
The Features Under Fire: Scroll, Autoplay, Alerts
Infinite scroll and autoplay are powerful engagement engines because they remove “stop cues,” keeping content flowing without a deliberate decision to continue. Push notifications add urgency that draws people back, often at times when they would otherwise disengage. Consumer advocates, including the Norwegian Consumer Council in its “manipulation” research, have repeatedly warned that such patterns can steer users into longer sessions than intended.
Platforms have off-ramps they can deploy. Examples include visible “end-of-feed” markers instead of bottomless feeds, mandatory pause screens after a set number of minutes, and notification batching that limits nighttime interruptions. The Commission’s findings imply that TikTok’s current implementation of these ideas—where they exist—does not introduce sufficient friction to meaningfully change behavior.
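As one illustration of these off-ramps, the sketch below batches pushes that arrive during a nighttime quiet window into a single morning digest. The Push type, the hour thresholds, and the delivery callback are assumed stand-ins, not any platform's real API.

```typescript
// Hypothetical sketch of notification batching with a nighttime quiet
// window: pushes queued during quiet hours are delivered as one digest
// the next morning instead of interrupting the user overnight.

interface Push { userId: string; body: string; queuedAt: Date; }

const QUIET_START_HOUR = 22; // 10 pm local time
const QUIET_END_HOUR = 8;    // 8 am local time

function inQuietHours(when: Date): boolean {
  const h = when.getHours();
  return h >= QUIET_START_HOUR || h < QUIET_END_HOUR;
}

class NotificationBatcher {
  private queue: Push[] = [];

  submit(push: Push, deliver: (p: Push) => void): void {
    if (inQuietHours(push.queuedAt)) {
      this.queue.push(push); // hold until the quiet window ends
    } else {
      deliver(push);
    }
  }

  // Called once when the quiet window closes: send a single digest
  // rather than replaying every held notification individually.
  flushDigest(deliver: (p: Push) => void): void {
    if (this.queue.length === 0) return;
    deliver({
      userId: this.queue[0].userId,
      body: `You have ${this.queue.length} updates from overnight.`,
      queuedAt: new Date(),
    });
    this.queue = [];
  }
}
```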
TikTok Pushes Back Against EU’s Preliminary Findings
TikTok rejected the assessment, calling it categorically false and pledging to challenge the findings. The company points to an array of tools it already offers, including daily screen-time limits, usage dashboards, bedtime reminders, restricted modes, and Family Pairing controls that let caregivers adjust teens’ settings.
Regulators remain unconvinced that these tools are effective in practice. Their critique lands amid growing evidence of heavy youth engagement on short-form video. Pew Research Center has reported that a significant share of U.S. teens use TikTok, with a notable minority saying they use it “almost constantly,” underscoring concerns about compulsive patterns and attention costs.

Why The Recommendation Engine Matters For Safety
The “For You” feed is TikTok’s growth engine, finely tuned to maximize watch time through rapid feedback signals—rewatches, short rewinds, likes, dwell time, and more. Under the DSA, platforms must offer at least one recommender option not based on profiling; in practice, that can mean a chronological or non-personalized feed, or a “Following” view that doesn’t use inferred interests.
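A hedged sketch of what that non-profiling option can look like in practice: the personalized ranker is swapped for a followed-accounts, newest-first ordering that uses no behavioral signals. The rankByEngagementModel stand-in and the Video shape are assumptions for illustration only.

```typescript
// Hypothetical sketch of the DSA requirement that platforms expose at
// least one feed option not based on profiling, alongside the default
// engagement-optimized ranker. All names here are illustrative.

interface Video { id: string; publishedAt: Date; authorFollowed: boolean; }

type FeedMode = "personalized" | "non_profiled";

// Stand-in for the profiled ranker (watch time, rewatches, dwell, etc.).
declare function rankByEngagementModel(candidates: Video[]): Video[];

function buildFeed(candidates: Video[], mode: FeedMode): Video[] {
  if (mode === "non_profiled") {
    // No inferred interests: followed accounts only, newest first.
    return candidates
      .filter(v => v.authorFollowed)
      .sort((a, b) => b.publishedAt.getTime() - a.publishedAt.getTime());
  }
  return rankByEngagementModel(candidates);
}
```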
The Commission’s push goes further, suggesting TikTok should alter ranking choices that incentivize late-night bingeing and rapid-fire novelty. Expect calls for transparency around which signals drive recommendations, limits on how sensitive behavioral data can be used, and default options that nudge toward healthier consumption patterns rather than maximal stickiness.
A Global Shift On Youth Safety And App Design Risks
Europe’s move aligns with broader efforts to restrict minors’ exposure to high-engagement designs. Governments in Australia have pushed platforms to tighten age controls, while authorities in the U.K. and Spain have explored stronger guardrails for teen use. Several European countries have advanced age-restriction policies, and in the United States, numerous states have passed or proposed age verification and parental consent laws for social media access.
Litigation is climbing as well. TikTok recently settled a high-profile case in the U.S. tied to claims of social media addiction, highlighting the legal risks when product design is alleged to harm young users. In the EU, confirmed DSA breaches can trigger fines of up to 6% of a company’s global annual turnover, with tougher measures possible for persistent noncompliance.
What Comes Next In The EU’s Case Against TikTok
TikTok now has an opportunity to respond formally to the Commission’s preliminary findings. The process can lead to binding decisions, negotiated commitments, or sanctions, depending on the company’s proposed remedies and the regulator’s assessment of their effectiveness.
Given the DSA’s reach, any TikTok redesign for EU users—hard stops, toned-down notifications, or a less aggressive recommender—could ripple globally as a new baseline. That is the strategic question for ByteDance: accept friction that curbs stickiness in Europe, or contest the findings and risk both penalties and reputational damage in one of the app’s most scrutinized markets.
