In the first days under new U.S. ownership, TikTok has stumbled into a one-two punch of technical outages and user outrage. A power failure at a stateside data center briefly knocked parts of the platform offline, and soon after, creators began reporting that posts on hot-button topics were being throttled or blocked. TikTok says the glitches are unrelated to the change in control; many users are unconvinced.
Service Disruptions After Ownership Shift
Shortly after the ownership transfer to a venture group that includes backers of Donald Trump, TikTok experienced a significant service disruption. The company attributed the outage to a power issue at a U.S. data center and said it was working with its infrastructure partner to stabilize systems. Outage trackers such as Downdetector registered a sharp spike in problem reports, ranging from failed uploads to comment errors.

By the following day, core functionality had largely returned, but creators still reported unpredictable behavior when posting new videos, suggesting the platform was operating in a degraded state, as is common after large-scale recoveries while caches, queues, and recommendation indexes rebuild.
Allegations of Suppressed Speech on TikTok
As services flickered back, prominent users accused the app of suppressing content. Posts about U.S. immigration enforcement drew particular attention: musician Billie Eilish amplified concerns on Instagram after noting minimal engagement on an anti-ICE post by Finneas, while actor-comedian Meg Stalter said TikTok blocked her from uploading similar content.
Another flashpoint: messages reportedly tripping errors when they included the word “Epstein.” CNBC said it reproduced the issue, adding fuel to claims of selective enforcement. The backdrop of new, politically connected ownership has heightened suspicion that moderation might tilt. That perception, accurate or not, is already shaping how creators self-censor or choose where to publish.
What TikTok Says and What We Know So Far
TikTok has denied intentional suppression. The company told CNN that the irregularities were fallout from the data center incident, not policy changes linked to the ownership shift. Executives also told CNBC the platform should not block the word “Epstein” and that engineers were investigating the bug alongside other outage-related problems.
In public posts, TikTok said it has “made significant progress” restoring U.S. infrastructure but warned users could still experience errors when publishing new content. That caveat lines up with how large-scale platforms recover: even after uptime returns, ranking systems, integrity checks, and messaging pipelines can lag or misfire, creating the appearance of shadowbans when the root cause is backlog and rate-limiting.

How Outages Can Look Like Censorship to Users
When social networks suffer outages, their safety and recommendation stacks often revert to conservative modes. That can mean stricter spam thresholds, delayed indexing, or automated filters erring on the side of blocking. In practice, creators see stalled uploads, comments that vanish, or videos that remain undiscoverable for hours: all classic “shadowban” symptoms without a deliberate policy change.
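To make the dynamic concrete, here is a minimal sketch of how a tightened threshold alone produces that effect. Everything below is hypothetical (the function name, scores, and thresholds are illustrative, not TikTok's actual systems):

```python
# Hypothetical sketch: a filter whose threshold tightens in "degraded"
# mode, so borderline posts get held even though no policy changed.

def should_hold(spam_score: float, degraded: bool) -> bool:
    """Return True if a post is held back.

    spam_score: 0.0 (clearly fine) .. 1.0 (clearly spam).
    degraded:   True while the platform recovers from an outage.
    """
    # Normal operation tolerates more; degraded mode errs toward blocking.
    threshold = 0.5 if degraded else 0.9
    return spam_score >= threshold

# The same borderline post (score 0.6) sails through normally
# but is held during recovery, looking like a shadowban to its author.
print(should_hold(0.6, degraded=False))  # False
print(should_hold(0.6, degraded=True))   # True
```

The point of the sketch is that no rule about content was changed, only a numeric threshold, yet the user-visible outcome is indistinguishable from targeted suppression.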
History shows real moderation errors happen too. Major platforms have apologized for mistaken takedowns during news events and protests when automated systems overreached. The lesson is straightforward: transparency around known issues, public post-incident reports, and clear logging that lets users see if a post is delayed for technical reasons, not policy, can cool down a crisis fast.
The Stakes for Creators and Advertisers on TikTok
TikTok says around 170 million Americans use the app, a scale that turns even brief disruptions into meaningful economic shocks for creators whose income relies on predictable reach. Brand partners watch the same signals: instability raises questions about ad delivery and brand safety, while allegations of viewpoint-based suppression carry reputational risk.
Digital rights groups, including the Electronic Frontier Foundation and the ACLU, have long pushed for clearer content rules and auditability across social platforms. For TikTok’s new U.S. leadership, publishing granular incident timelines, updating enforcement transparency reports, and offering appeal visibility in real time would be a strong opening move to restore trust.
What to Watch Next as TikTok Addresses US Outages
Three indicators will tell users if the situation is stabilizing: consistent upload success rates during peak hours, normalizing engagement curves for news-adjacent content, and a concrete postmortem detailing the data center failure and downstream bugs. Independent verification helps: third-party monitoring and reproducible tests by newsrooms can validate TikTok’s explanations.
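The first of those indicators is straightforward to measure from the outside. A rough sketch of the idea, assuming only that a monitor can log whether each test upload succeeded (the class and window size are illustrative, and no real TikTok API is involved):

```python
# Hypothetical monitoring sketch: record upload attempts and report a
# rolling success rate, the kind of signal third-party monitors use to
# check a platform's recovery claims against its public statements.
from collections import deque


class UploadMonitor:
    def __init__(self, window: int = 100):
        # Keep only the most recent `window` attempts.
        self.results = deque(maxlen=window)

    def record(self, succeeded: bool) -> None:
        self.results.append(succeeded)

    def success_rate(self) -> float:
        if not self.results:
            return 0.0
        return sum(self.results) / len(self.results)


# Simulated probe data: 90 successes and 10 failures in the window.
monitor = UploadMonitor(window=100)
for ok in [True] * 90 + [False] * 10:
    monitor.record(ok)
print(f"{monitor.success_rate():.0%}")  # 90%
```

A steady rate near 100% during peak hours would corroborate the company's recovery claims; a rate that dips for posts on specific topics would not.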
For now, the story is simple: an untimely outage collided with a politically sensitive ownership change, creating a credibility test. Whether TikTok passes will depend less on assurances and more on verifiable fixes, steady uptime, and a paper trail that shows the lights are truly back on.
