Leaked internal documents are giving the clearest picture yet of how much OpenAI is paying Microsoft under a multiyear partnership, one that has given Microsoft an exclusive license to one of the most sophisticated AI models ever built, along with "tech support" from the company's army of engineers.
What the leaked documents claim about payments
OpenAI paid Microsoft $493.8 million in 2024 and $865.8 million through the first three quarters of 2025 in revenue-share payments, according to materials reviewed by tech blogger Ed Zitron. The documents are consistent with a revenue share widely reported as 20% of OpenAI's sales, though neither company has publicly confirmed that figure or acknowledged the leaked numbers.

The payments are part of an unusual partnership in which Microsoft made an initial $1 billion investment in OpenAI, a research lab founded by some of the most prominent people in the tech industry to ensure artificial intelligence doesn't destroy humanity, and has since committed billions more. The two companies agreed to jointly develop new Azure AI supercomputing technologies, and Microsoft has integrated OpenAI's models into its own products. The leaked information is a rare, granular window into the economics of the deal: while Microsoft says AI offerings are helping lift Azure growth, it does not break out Azure OpenAI or revenue-share inflows from OpenAI in its filings.
What the math means for OpenAI’s reported sales
If the 20% figure is accurate, the leaked payment totals would mean OpenAI pulled in about $2.469 billion in revenue for 2024 and around $4.329 billion over just the first nine months of 2025. Those estimates roughly align with previous reporting from The Information, which put OpenAI’s projected 2024 revenue in the neighborhood of $4 billion and hinted at a steep acceleration in 2025.
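As a quick sanity check on that math, here is a minimal back-of-envelope sketch in Python. It assumes the widely reported, but unconfirmed, 20% revenue share and uses the payment figures from the leaked documents; the revenue it prints is implied, not disclosed.

```python
# Back-of-envelope: implied OpenAI revenue from the leaked revenue-share
# payments, assuming the widely reported (unconfirmed) 20% rate.
REVENUE_SHARE = 0.20  # reported figure; neither company has confirmed it

payments_to_microsoft_usd = {
    "2024 (full year)": 493.8e6,          # leaked figure
    "2025 (first nine months)": 865.8e6,  # leaked figure
}

for period, payment in payments_to_microsoft_usd.items():
    implied_revenue = payment / REVENUE_SHARE
    print(f"{period}: ~${implied_revenue / 1e9:.3f}B implied revenue")

# Prints roughly:
#   2024 (full year): ~$2.469B implied revenue
#   2025 (first nine months): ~$4.329B implied revenue
```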
OpenAI's leadership has also cast the business as scaling rapidly, with public comments suggesting annualized revenue run rates well above $20 billion. Run rates are projections, not recognized revenue (a company booking $2 billion in its latest month could describe itself as being on a $24 billion annualized run rate), but they go a long way toward explaining why Microsoft's revenue-share payments are growing: more customers are buying ChatGPT Enterprise, API credits, and GPT-enabled workflows, all of which feed OpenAI's top line and, through the revenue share and cloud consumption, Microsoft's take.
The cost side and rising margin pressure at scale
Zitron's analysis indicates that OpenAI paid approximately $3.8 billion for inference in 2024, a figure that jumped to nearly $8.65 billion over the first nine months of 2025. Inference, the compute cost of actually running model queries, has become the primary cost driver behind large-scale AI services. Earlier reporting put OpenAI's overall compute spend at roughly $5.6 billion in 2024 and its cost of revenue at around $2.5 billion in the first half of 2025, figures that underscore just how costly it is to serve AI at consumer and enterprise scale.
If the leaked inference totals are accurate, they suggest there have been stretches when OpenAI's cost of serving queries outstripped its recognized revenue, pressuring gross margins even as usage rose. That would be consistent with the industrywide forces at work: GPU shortages, steep premiums on state-of-the-art accelerators, and the burden of carrying capacity for unpredictable demand. It also helps explain OpenAI's continued push toward more efficient models, caching, and specialized hardware strategies to bend the cost curve.
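To see how heavy that pressure would be, here is an illustrative continuation of the sketch above. It pairs the leaked inference costs with the revenue implied by the 20% assumption and, for simplicity, treats inference as the whole cost of revenue, which it is not; the point is only the direction of the math.

```python
# Illustrative gross-margin math: implied revenue (from the 20% assumption)
# versus the leaked inference costs. Treats inference as the only cost of
# revenue for simplicity, which understates total costs.
periods_usd = {
    # period: (implied revenue, leaked inference cost)
    "2024 (full year)": (2.469e9, 3.8e9),
    "2025 (first nine months)": (4.329e9, 8.65e9),
}

for period, (revenue, inference_cost) in periods_usd.items():
    margin = (revenue - inference_cost) / revenue
    print(f"{period}: inference-only gross margin ≈ {margin:.0%}")

# Both periods come out deeply negative, which is the squeeze the
# leaked documents point to.
```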
Why Microsoft benefits financially regardless of outcomes
The arrangement makes economic sense for Microsoft on multiple levels. The company earns a share of OpenAI's revenue and also monetizes the underlying compute via Azure. Executives have pointed to AI workloads on earnings calls as a material contributor to Azure's growth, even as they acknowledge the near-term margin drag that tends to accompany rapid scaling of infrastructure.
Most crucially, the terms of the deal work both for platform adoption and for cash flows to Microsoft. Each ChatGPT query or enterprise API call that runs on Azure deepens customer stickiness in Microsoft's cloud, promoting cross-sell of data, security, and developer services. The leaked payment numbers suggest a reinforcing loop: more revenue for OpenAI means higher revenue-share payments to Microsoft, and the usage behind that revenue also drives greater Azure consumption.
Cloud dependence and diversification across providers
Azure is OpenAI's main compute home, but the company has been hunting for additional capacity through partners such as CoreWeave and Oracle and has also weighed deals with AWS and Google Cloud. That diversification can improve resiliency and pricing leverage, but it also complicates cost management, and it does not necessarily change the basic revenue-share mechanics with Microsoft, which are pegged to OpenAI's overall sales.
For businesses, the message is simple: behind the polished interfaces of generative AI lies a hefty, and often variable, infrastructure bill. Unit economics will be determined by model size, prompt complexity, and optimizations such as quantization and retrieval, along with contract terms with cloud providers, as the toy sketch below illustrates. Those levers will decide how swiftly AI services can shift from hypergrowth to sustainable profitability.
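As an illustration of how those levers interact, here is a toy per-query cost model. Every number in it, the token counts, the per-1,000-token rate, and the savings factors, is a hypothetical placeholder rather than a figure from the leaked documents or any provider's price list.

```python
# Toy per-query serving-cost model. All inputs are hypothetical placeholders
# chosen only to show how the levers in the paragraph above interact.
def cost_per_query(prompt_tokens: int, output_tokens: int,
                   cost_per_1k_tokens: float,
                   quantization_savings: float = 0.0,
                   cache_hit_rate: float = 0.0) -> float:
    """Rough serving cost for one query, in dollars."""
    tokens = prompt_tokens + output_tokens
    cost = tokens / 1000 * cost_per_1k_tokens
    cost *= 1 - quantization_savings  # cheaper inference per token
    cost *= 1 - cache_hit_rate        # a share of queries answered from cache
    return cost

# A long enterprise-style prompt, with and without optimizations.
baseline = cost_per_query(3000, 800, cost_per_1k_tokens=0.01)
optimized = cost_per_query(3000, 800, cost_per_1k_tokens=0.01,
                           quantization_savings=0.3, cache_hit_rate=0.2)
print(f"baseline: ${baseline:.4f} per query, optimized: ${optimized:.4f}")
```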
What to watch next for OpenAI, Microsoft, and Azure
The leaked figures pose three crucial questions.
- First, how fast can OpenAI bring down the unit cost of inference as usage grows?
- Second, will Microsoft start breaking out more AI-specific metrics, including how much of Azure's growth is tied to OpenAI traffic and revenue-share inflows?
- Third, how much can pricing, model efficiency, and custom silicon shift margins over the next few product cycles?
Until those answers firm up, the leaked documents offer a rare, data-backed peek into one of tech's most important partnerships, one in which every new user isn't just trying out the latest AI model but moving real money between two of the industry's most closely watched companies.