
OpenAI’s $13B dash to the finish and what comes next

By Gregory Zuckerman
Last updated: October 15, 2025 6:05 am

OpenAI is making real money today and preparing for spending that will be all but inconceivable tomorrow. The work is paying off: the Financial Times reports the company is on a roughly $13 billion annual run rate, with around 70% of that coming from consumers who pay $20 per month for ChatGPT. At the same time, OpenAI has told partners it plans to help fund and use more than $1 trillion in artificial intelligence infrastructure over the next decade. The company's own sense of urgency is palpable: it has given itself about five years to turn a hit chat app into a business with multiple revenue streams and infrastructure to match.

The math of trillion-dollar scale and revenue realities

However many consumer subscribers OpenAI signs up, subscriptions alone won't foot the bill for a trillion-dollar buildout. At $20 a month, a billion paying users would be worth $240 billion a year, a mind-blowing figure but still only one piece of the total capital that OpenAI and its partners will spend across data centers, chips, and power. For context, Apple booked about $383 billion in revenue last fiscal year and Microsoft was approaching $250 billion, according to their SEC filings.
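
For a rough sense of scale, the back-of-envelope math implied by the figures above looks like the sketch below. Every number is approximate, and the implied subscriber count is derived from the public estimates rather than anything OpenAI has reported.

    # Back-of-envelope math from the figures cited above (all approximate).
    run_rate = 13e9             # ~$13B annual run rate (FT estimate)
    consumer_share = 0.70       # ~70% of revenue from consumer subscriptions
    price_per_year = 20 * 12    # the $20/month ChatGPT plan

    consumer_revenue = run_rate * consumer_share
    implied_subscribers = consumer_revenue / price_per_year
    print(f"Implied paying subscribers today: ~{implied_subscribers / 1e6:.0f}M")  # ~38M

    # Even a hypothetical one billion subscribers tops out near $240B a year,
    # well short of funding the buildout once compute, payroll, and everything
    # else comes out of that revenue.
    print(f"Revenue at one billion subscribers: ~${1e9 * price_per_year / 1e9:.0f}B per year")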


The gap also helps explain OpenAI's multi-lane strategy. Higher-value enterprise contracts, scaled API consumption, and new premium products will have to lift average revenue per user while repairing gross margins squeezed by compute costs. Margins matter when inference is expensive and most workloads run on rented cloud capacity rather than infrastructure OpenAI owns.
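
As a minimal illustration of why average revenue per user and compute costs govern the margin picture, here is a toy unit-economics sketch. Every figure is hypothetical and chosen only to show the mechanics; none comes from OpenAI.

    # Toy unit economics: how inference costs squeeze gross margin.
    # All numbers below are hypothetical, for illustration only.
    def gross_margin(arpu_per_month: float, compute_cost_per_month: float) -> float:
        """Gross margin = (revenue - cost to serve) / revenue."""
        return (arpu_per_month - compute_cost_per_month) / arpu_per_month

    # A $20/month subscriber whose usage costs $8/month to serve...
    print(f"Consumer subscriber margin: {gross_margin(20, 8):.0%}")   # 60%

    # ...versus an enterprise seat billed higher for similar compute usage.
    print(f"Enterprise seat margin:     {gross_margin(60, 10):.0%}")  # ~83%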

New lines of business clock in as OpenAI diversifies

The five-year plan, as reported by the Financial Times, spans government contracts, shopping tools, video services, consumer devices, and even selling compute to others.

This is not blue-sky brainstorming; it's an effort to diversify away from a single subscription-and-API model that rides the volatility of GPU supply and cloud pricing.

Enterprise demand is real. PwC said it planned a mass deployment across its workforce, Morgan Stanley put an AI assistant in the hands of thousands of financial advisors, and Klarna reported that most of its customer service chats are now handled by its AI agent, lowering costs and improving response times, according to the company. Those are the kinds of proof points OpenAI can use to convert pilots into long-term contracts with six- and seven-figure annual commitments.

Video is another lever. OpenAI's Sora demos put brand advertising, entertainment previsualization, and synthetic content creation in play. If priced and scaled correctly, high-value video generation could become a new class of revenue, much as cloud GPU rendering has become a business in its own right.

Hardware may follow. The Information and the Financial Times have reported on OpenAI's exploratory work on consumer devices, including potential partnerships with industrial design veterans. If a custom AI-first gadget can solve trusted input, privacy, and on-device inference elegantly, it could anchor services revenue the way the smartphone anchors the app ecosystem.

Compute and power emerge as the bottleneck to AI growth

The issue is not demand but supply. The FT says more than 26 gigawatts of computing capacity is lined up across deals with Oracle, Nvidia, AMD, and others. That is utility-scale power, roughly the output of two dozen large power plants, dedicated to AI. The International Energy Agency's estimate that global data center electricity consumption could double by mid-decade underlines the strain on grids and the scramble to secure long-term power contracts.
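
To put 26 gigawatts in perspective, a rough conversion helps. The plant size used below is an assumption (roughly 1.1 GW for a large nuclear or gas unit), not a figure from the reporting.

    # Rough scale check on the reported 26 GW of contracted compute capacity.
    contracted_gw = 26
    typical_plant_gw = 1.1   # assumption: output of one large nuclear or gas unit
    equivalent_plants = contracted_gw / typical_plant_gw
    print(f"Roughly {equivalent_plants:.0f} large power plants' worth of capacity")  # ~24

    # Continuous draw over a year, for comparison with grid-scale consumption.
    annual_twh = contracted_gw * 8760 / 1000   # GW x hours per year -> TWh
    print(f"~{annual_twh:.0f} TWh per year if drawn continuously")  # ~228 TWh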


OpenAI’s “Stargate” data center effort, as reported by the Wall Street Journal and others, signals a bet that vertically integrated compute can become a competitive moat: co-designing chips, networking, and cooling to drive cost per token down. Custom silicon, which could be co-developed with partners such as Broadcom and manufactured by TSMC, may reduce unit costs and expand margins, but it is also far more costly upfront and leaves practically no room for error.

Enterprise stickiness and the platform play OpenAI pursues

To justify infrastructure on that scale, OpenAI cannot remain just a product; it has to become a platform. That means deeper SDKs, agent frameworks that can automate processes end to end, and what amounts to app-store economics in which third parties build on it and profit. The GPT Store is the low-hanging fruit; the real opportunity is enterprise-grade tooling (governance, security, monitoring, and customization) where switching costs start to accumulate.

Competition is intensifying. Meta’s Llama family and startups like Mistral are pushing powerful open models; cloud providers are offering bundles of their own foundation models at tempting prices. OpenAI’s value proposition will have to be model quality, reliability, safety, and total cost of ownership delivered with predictable SLAs — things enterprises will pay for at scale when AI touches regulated data and mission-critical processes.

Risks if the plan slips amid partners, regulation, and markets

The FT notes that several of America's most valuable companies now have significant contracts tied to OpenAI. That kind of interdependence creates a new sort of concentration risk. If performance, uptime, or cost curves disappoint, the effects could reverberate through partner roadmaps and potentially cause aftershocks in public markets. Regulatory moves on data use and model training could also change the economics overnight.

There's partner risk, too. OpenAI depends heavily on Microsoft's Azure for compute and distribution. The relationship has worked for both sides, but it complicates margin control and strategic independence, and the tension will only grow if OpenAI itself becomes a compute supplier. Clear disclosures on cost sharing, capital spending, and long-term purchase commitments will be important to maintain investor confidence.

What to watch over the next five years as OpenAI scales

Leading indicators include the following:

  • The share of regular users who convert to paid plans, compared with six to nine months ago
  • Annual contract value (ACV) for ChatGPT Enterprise and API customers
  • Gross margin trends as new silicon arrives
  • Specific Stargate financing, power purchase agreements, and grid interconnection milestones
  • Government certifications and sector-specific models
  • Whether OpenAI can monetize video and agents at large scale

Turning a $13 billion run rate into a business that can support $1 trillion of infrastructure isn't a straight-line plan; it's a portfolio of bets on lower unit costs and higher-value software. The clock is ticking, but in AI, cost curves and capabilities can shift more quickly than one might assume. What OpenAI does over the next five years and beyond may determine whether a consumer-first phenomenon can evolve into the backbone of an era of computing.

Gregory Zuckerman
Gregory Zuckerman is a veteran investigative journalist and financial writer with decades of experience covering global markets, investment strategies, and the business personalities shaping them. His writing blends deep reporting with narrative storytelling to uncover the hidden forces behind financial trends and innovations. Over the years, Gregory’s work has earned industry recognition for bringing clarity to complex financial topics, and he continues to focus on long-form journalism that explores hedge funds, private equity, and high-stakes investing.