TikTok’s long-awaited U.S. restructuring is shifting from possibility to reality. A group of investors has been chosen to take over control of TikTok, and the new venture will retrain the platform’s recommendation algorithm on data from American users, a crucial move intended in part to satisfy national security concerns and keep the app available to its more than 150 million U.S. users.
Under the proposal, a new venture, called TikTok USDS Joint Venture LLC, will assume broad oversight of U.S. operations, according to an internal memo seen by Reuters and The Hollywood Reporter. The consortium, which includes U.S. tech group Oracle, private equity firm Silver Lake, and Abu Dhabi’s Mubadala Investment Company through its subsidiary MGX, will together own 50% of the American business. Each of the three will hold 15%, with the remaining 5% going to other investors. Affiliates of current ByteDance investors will own 30.1%, and ByteDance will keep 19.9%.

The memo also says that the joint venture will control how content is moderated, oversee the deployment and integrity of TikTok’s software, and manage U.S. user data in a secure cloud environment run by Oracle. The framework is an evolution of the earlier “Project Texas” protections but introduces a significant governance shift: the algorithm that drives what Americans see in their For You feed will be retrained and overseen by the new U.S. venture.
Why Retraining the U.S. Algorithm Matters
Recommendation systems are TikTok’s engine, and retraining the U.S. model is, in effect, an exercise in algorithmic sovereignty: demonstrating that rankings and distribution are independent of external interference. Lawmakers and national security officials have long called for precisely this sort of ring-fenced control, and the company’s pledge tracks with what regulators like CFIUS have sought for years: verifiable constraints on outside influence over what content trends in the U.S.
Retraining is, technically speaking, not a matter of flipping a switch. The model will either be forked or rebuilt on a purely domestic corpus, then fine-tuned with real-time reinforcement signals and validated through large-scale A/B tests. Any change in training data or policy weights could affect watch time, session length, and creator reach. Platforms that have adjusted core ranking systems, such as YouTube’s pivots on watch time and Shorts, have historically weathered short-term turbulence as metrics rebalanced. Expect a conservative, tiered rollout designed to keep users’ feeds from suffering cold-start shock; a sketch of what such staging could look like follows.
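The memo does not describe the rollout mechanics, so the following is a minimal, hypothetical sketch of hash-based bucketing that routes a small, growing share of users to a retrained ranker while everyone else stays on the incumbent model. The stage percentages, experiment name, and function names are assumptions for illustration, not details from the memo.

```python
import hashlib

# Hypothetical tiered-rollout schedule: share of U.S. users served by the
# retrained ranker at each stage. Values are illustrative, not from the memo.
ROLLOUT_STAGES = [0.01, 0.05, 0.20, 0.50, 1.00]


def assign_bucket(user_id: str, experiment: str = "us-retrained-ranker") -> float:
    """Deterministically map a user to a value in [0, 1) for bucketing."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest[:8], 16) / 0xFFFFFFFF


def serve_retrained_model(user_id: str, stage: int) -> bool:
    """Return True if this user should see the retrained model at this stage."""
    return assign_bucket(user_id) < ROLLOUT_STAGES[stage]


# Example: at stage 1 (5% rollout), only a small slice of users is exposed,
# keeping any cold-start regressions contained while guardrail metrics
# (watch time, session length, creator reach) are compared across buckets.
if __name__ == "__main__":
    exposed = sum(serve_retrained_model(f"user-{i}", stage=1) for i in range(100_000))
    print(f"{exposed / 100_000:.1%} of sampled users routed to the retrained ranker")
```

Deterministic hashing keeps each user in the same bucket across sessions, which is what lets before-and-after metrics be compared cleanly during a staged migration.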
What It Means for U.S. Users and Creators on TikTok
For users, the For You feed may seem slightly different as the model refines its understanding of topics from a U.S.-only training set and updated policy rules. The immediate goal is stability: keeping habitual viewing patterns intact without weakening the platform’s ability to suppress bad behavior. Over time, the feed could skew more toward U.S.-local signals such as regional news, sports, music, and culture if those cues become more heavily weighted in the retrained model.
Creators should watch their analytics closely for early signs. If the platform leans on new integrity checks, some categories may experience delayed discovery as the systems settle. Previous disclosures from TikTok and competing platforms indicate that post labeling, viewer feedback, and policy classifiers can slightly dampen reach when classifier confidence is low at this stage, normalizing as the model trains.
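TikTok has not published how that dampening is computed. As a purely illustrative sketch, a ranking score might be scaled down when an integrity classifier flags a post with low confidence and left untouched otherwise; the threshold and multiplier below are assumptions, not disclosed values.

```python
from dataclasses import dataclass


@dataclass
class IntegritySignal:
    """Hypothetical output of a policy classifier for a single post."""
    flagged: bool       # classifier thinks the post may violate policy
    confidence: float   # classifier's confidence in its own judgment, 0..1


def adjusted_reach_score(base_score: float, signal: IntegritySignal,
                         low_conf_threshold: float = 0.6,
                         dampen_factor: float = 0.8) -> float:
    """Slightly dampen distribution when a flag is raised with low confidence.

    High-confidence flags would presumably go to human review or removal;
    this sketch covers only the uncertain middle ground described above.
    """
    if signal.flagged and signal.confidence < low_conf_threshold:
        return base_score * dampen_factor
    return base_score


# Example: an uncertain flag trims reach by 20%; a clean, confident pass does not.
print(adjusted_reach_score(1.0, IntegritySignal(flagged=True, confidence=0.4)))   # 0.8
print(adjusted_reach_score(1.0, IntegritySignal(flagged=False, confidence=0.9)))  # 1.0
```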

Advertisers, meanwhile, have pushed for clearer separation of data and tougher brand-safety controls. Industry groups like the Global Alliance for Responsible Media have called for standards, and a retrained model plus centralized moderation oversight would make compliance easier to demonstrate. Market researchers like eMarketer have noted persistent growth in demand for TikTok ads, and this governance revamp could reassure buyers who are considering multiyear deals.
Data and Security Controls for the U.S. TikTok Venture
The joint venture’s data stance revolves around Oracle’s U.S. cloud, which will store and process data from American users. That residency layer, combined with gated access paths and auditability, is intended to block cross-border data flows and ring-fence source code governance. Prior third-party security evaluations under Project Texas set a foundation; the updated ownership and accountability model raises the bar by putting data control, algorithm stewardship, and policy enforcement in one U.S.-governed stack.
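The memo does not spell out the enforcement mechanics. A minimal sketch of what a residency gate with audit logging could look like is below; the region names, record IDs, and policy are invented for the example and do not come from Oracle’s actual services or APIs.

```python
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("usds-audit")

# Hypothetical policy: U.S. user records may only be written to U.S.-region storage.
ALLOWED_REGIONS_FOR_US_DATA = {"us-east", "us-west"}


def write_user_record(user_region: str, target_region: str, record_id: str) -> bool:
    """Allow the write only if residency policy permits it, and audit the decision."""
    allowed = user_region != "US" or target_region in ALLOWED_REGIONS_FOR_US_DATA
    audit_log.info(
        "record=%s user_region=%s target_region=%s allowed=%s at=%s",
        record_id, user_region, target_region, allowed,
        datetime.now(timezone.utc).isoformat(),
    )
    return allowed


# Example: a U.S. record bound for a non-U.S. region is rejected and logged,
# while a write to an approved U.S. region goes through.
write_user_record("US", "eu-central", "rec-123")   # blocked
write_user_record("US", "us-east", "rec-124")      # permitted
```

The audit trail is the point: it is what independent reviewers would inspect to verify that cross-border flows are actually being refused, not just discouraged.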
Notably, the memo states that the U.S. venture will review and approve content moderation policies. That hands the consortium a lever not only over infrastructure but also over the signals fed to the algorithm: what gets boosted, what gets demoted, and when intervention is called for. Transparency reports and independent testing, often called for by digital rights groups and academic researchers, will be crucial to showing that the system works as advertised.
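As a toy illustration of that lever, approved policy labels could translate directly into ranking multipliers. The labels and weights below are invented for the example and are not drawn from TikTok’s actual policy taxonomy.

```python
# Hypothetical policy table: how moderation decisions could translate into
# ranking signals. Labels and multipliers are illustrative only.
POLICY_MULTIPLIERS = {
    "eligible": 1.0,           # normal distribution
    "boost_civic": 1.2,        # e.g., authoritative public-interest information
    "demote_borderline": 0.5,  # content near, but not over, a policy line
    "remove": 0.0,             # never distributed
}


def apply_policy(base_score: float, label: str) -> float:
    """Scale a candidate's ranking score by the multiplier for its policy label."""
    return base_score * POLICY_MULTIPLIERS.get(label, 1.0)


# Example: two candidates with equal base scores end up ranked very differently
# once the venture's approved policy labels are applied.
print(f"{apply_policy(0.9, 'boost_civic'):.2f}")        # 1.08
print(f"{apply_policy(0.9, 'demote_borderline'):.2f}")  # 0.45
```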
What Comes Next for TikTok’s Proposed U.S. Venture
Several hurdles remain, including entering into definitive agreements, securing regulatory approval, and completing the technical migration. On the product side, leaders also haven’t said whether a standalone app will be necessary once the new venture assumes full control. If past major platform transitions are any indication, expect continuity from an end-user perspective alongside behind-the-scenes changes, with policy and algorithmic updates rolling out only after the retrained model is confirmed at scale.
TikTok has previously said that more than 150 million people in the United States use the service, a level of reach that allows little margin for error. If the implementation holds steady, the arrangement suggests a blueprint for reconciling platform dynamism with national security expectations: local data, local governance, and a recommendation engine rebuilt to match the regulatory moment.
