OpenAI has secured $110 billion in private funding, a staggering sum that places the company among the most heavily backed startups ever. The round includes $50 billion from Amazon and $30 billion each from Nvidia and SoftBank, set against a $730 billion pre-money valuation. The company said the round remains open and that additional investors may join.
OpenAI framed the raise as fuel for an urgent infrastructure race, arguing that leadership in frontier AI now hinges on who can scale compute and convert it into dependable, widely used products. As with prior raises, a meaningful portion of the headline figure is expected to reflect services and long-term capacity commitments alongside cash, though the exact split was not disclosed.

A New Scale for Private Capital in AI Financing
The magnitude of this round redefines the ceiling for private AI financings. OpenAI’s previous raise closed at $40 billion against a $300 billion valuation, then a record. By comparison, global venture records tracked by CB Insights and PitchBook have historically topped out in the low tens of billions, with Ant Group’s $14 billion financing often cited as a high-water mark. A $110 billion round resets the norms for capital intensity in AI.
It also underscores how competitive dynamics have shifted from model quality alone to sustained access to chips, data centers, and distribution. While rivals have announced multi-billion-dollar rounds, none approach this scale, signaling a consolidation of resources around platforms that can secure compute at industrial levels.
Amazon Partnership Deepens Around AI Infrastructure
Anchoring the raise is an expanded alliance with Amazon. OpenAI plans to build a “stateful runtime environment” for its models on Amazon’s Bedrock platform, while extending its existing AWS agreement by an additional $100 billion in compute services. As part of the pact, OpenAI has committed to consuming at least 2 GW of AWS Trainium compute and will develop custom models to power Amazon consumer products.
“We have lots of developers and companies eager to run services powered by OpenAI models on AWS,” said Amazon CEO Andy Jassy, adding that the collaboration around stateful runtimes will “change what’s possible for customers building AI apps and agents.”
The Information previously reported that a further $35 billion from Amazon, beyond its $50 billion commitment, could be contingent on milestones such as achieving AGI or pursuing an IPO. OpenAI confirmed the funding split and said only that the additional $35 billion would arrive “in the coming months when certain conditions are met,” implying Amazon’s total commitment could rise past $50 billion as milestones are achieved.
Nvidia and SoftBank Bet on Scale and Capacity
OpenAI disclosed fewer specifics on Nvidia’s participation, but said the deal includes commitments for 3 GW of dedicated inference capacity and 2 GW of training capacity on “Vera Rubin” systems. Nvidia’s stake follows months of speculation about the size of its backing; CEO Jensen Huang recently reiterated confidence, saying, “We will invest a great deal of money. I believe in OpenAI. The work that they do is incredible.”

SoftBank’s $30 billion bet aligns with its broader thesis that AI workloads will drive demand from data center silicon to edge devices. With control of Arm, SoftBank sits at the center of CPU roadmaps that underpin cloud and mobile AI, suggesting potential synergies as inference expands beyond hyperscale environments.
What the Money Buys: Guaranteed Compute Access
Beyond headline valuations, this financing is fundamentally about guaranteed access to compute. Measuring commitments in gigawatts highlights a new reality: AI capacity is now constrained not just by chips but by power, cooling, and network fabric. The services-heavy structure helps lock in priority lanes across these bottlenecks.
Energy and siting are pivotal. The International Energy Agency has warned that electricity use from data centers is on track to surge this decade, and industry groups like the Uptime Institute have flagged constraints around grid interconnects and water availability. To deliver on promised capacity, OpenAI and its partners will need to pair advanced silicon with low-cost power, efficient cooling, and high-throughput networking at unprecedented scale.
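The gigawatt framing above can be turned into a rough sense of scale with simple arithmetic. The sketch below is purely illustrative: the per-accelerator power draw and the PUE (power usage effectiveness) figure are hypothetical round numbers, since no such details were disclosed in the deal.

```python
# Back-of-envelope: rough accelerator counts implied by a gigawatt-scale
# compute commitment. Per-chip power and PUE are illustrative assumptions,
# not disclosed deal terms.

def chips_for_capacity(capacity_gw: float,
                       chip_power_kw: float,
                       pue: float = 1.3) -> int:
    """Estimate how many accelerators a power commitment can support.

    capacity_gw   -- committed facility power in gigawatts
    chip_power_kw -- assumed draw per accelerator, incl. host share (kW)
    pue           -- power usage effectiveness (cooling/network overhead)
    """
    usable_watts = capacity_gw * 1e9 / pue      # power left for IT load
    return int(usable_watts / (chip_power_kw * 1e3))

# Example: a 2 GW commitment, assuming ~1.5 kW per accelerator and a
# PUE of 1.3 -- on the order of a million chips.
print(chips_for_capacity(2.0, 1.5))
```

Even with generous assumptions, numbers at this scale make clear why siting, grid interconnects, and cooling, not just chip supply, become the binding constraints the paragraph above describes.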
Valuation and What to Watch in the Months Ahead
A $730 billion pre-money valuation implies expectations of rapid revenue expansion from API usage, enterprise licenses, agents, and consumer subscriptions. The open nature of the round leaves room for additional strategic investors—potentially across cloud, telecom, and semiconductor ecosystems—seeking preferred access to models and tooling.
Regulators in the US, EU, and UK have already scrutinized AI-cloud tie-ups, so the deepening of relationships across compute, distribution, and services will likely attract attention. Meanwhile, execution milestones are clear: stand up the new runtime on Bedrock, scale Nvidia-backed inference and training capacity, and translate infrastructure guarantees into reliable latency, uptime, and cost profiles for developers.
However it ultimately closes, this is a watershed financing. It signals that the AI stack—chips, power, and platforms—is now a capital project on the order of global telecom and cloud buildouts. The next test begins immediately: turn balance-sheet commitments into durable product advantages and defensible margins at world scale.
