
OpenAI Bets Big on AMD With World’s Largest Chip Deal

By Bill Thompson
Last updated: October 6, 2025 5:22 pm
Technology · 7 Min Read

OpenAI is making a big bet on AMD, signing an agreement that dedicates up to six gigawatts of AI computing capacity to running its software stack on AMD silicon over the next few years, with the first gigawatt expected next year. According to the companies' announcement, the deal also gives OpenAI a path to a stake of up to 10 percent in AMD, and an AMD executive confirmed that the partnership is expected to generate tens of billions of dollars in revenue.

“This partnership is an important step to expand the computing power required to achieve our vision of creative AI,” said OpenAI CEO Sam Altman, adding that AMD’s high-performance chips would help ship more powerful AI systems sooner.

Table of Contents
  • Why AMD Is a Major Factor in the AI Hardware Battle
  • The Math and Logistics Behind Six Gigawatts of AI
  • Supply Chain and Pricing Implications for AI Chips
  • What to Watch Next as OpenAI Scales on AMD Hardware
OpenAI and AMD logos with AI chips, illustrating the largest semiconductor deal

The move comes days after OpenAI announced a separate, larger buildout on Nvidia systems totaling around 10 gigawatts.

Fortune has reported that the buildout could use as much electricity as all of the largest U.S. cities combined, a sobering reminder that AI's next steps forward may rely on little more than raw energy and the chips that transform it into usable computation.

Why AMD Is a Major Factor in the AI Hardware Battle

Nvidia has been the king of cloud AI acceleration for most of the past two years. Analyst firms such as Dell’Oro Group and Omdia have pegged Nvidia as holding the lion’s share of AI accelerator revenue, thanks to its more mature software ecosystem and proven supply scale. AMD’s counterpunch is the MI300 family, high-memory accelerators aimed at large language model training and inference, paired with the company’s ROCm software stack.

A number of hyperscalers have already adopted AMD’s MI300X parts: Microsoft offers the chips in its cloud, and numerous AI startups have previewed promising cost-per-token figures for workloads bottlenecked by memory bandwidth.
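Why memory bandwidth dominates cost-per-token can be seen in a simple roofline sketch: at small batch sizes, every generated token must stream the full set of model weights from HBM, so bandwidth caps throughput. The bandwidth and model-size figures below are illustrative assumptions for the sketch, not vendor benchmarks:

```python
# Roofline estimate of memory-bandwidth-bound inference throughput.
# All figures are illustrative assumptions, not official specs or benchmarks.

def tokens_per_second(hbm_bandwidth_gbs: float, model_params_billions: float,
                      bytes_per_param: int = 2) -> float:
    """Upper bound on batch-1 decode throughput: each token requires
    streaming all model weights from HBM, so tokens/s <= bandwidth / model size."""
    model_bytes = model_params_billions * 1e9 * bytes_per_param
    return hbm_bandwidth_gbs * 1e9 / model_bytes

# Example: a 70B-parameter model with 16-bit weights on an accelerator
# assumed to have ~5,300 GB/s of HBM bandwidth.
print(f"{tokens_per_second(5300, 70):.1f} tokens/s per device (upper bound)")
```

Larger on-package memory helps on the other axis too: a model that fits on fewer devices avoids cross-device communication entirely, which is part of AMD's high-memory pitch.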

A serious OpenAI deployment would sharpen that edge: large real-world workloads expose a software stack's weaknesses quickly, and the countless optimization loops that follow flow back into the general developer ecosystem.

Strategically, OpenAI is also signaling supplier diversification. Securing capacity with both AMD and Nvidia reduces single-vendor risk and strengthens OpenAI's negotiating position, while spreading exposure across two different product roadmaps: Nvidia's Blackwell platform on one hand, and AMD's next-generation accelerators on the other.

The Math and Logistics Behind Six Gigawatts of AI

Six gigawatts is utility-scale infrastructure, the equivalent of what several large power plants produce, and it's emblematic of the clash between AI's ambitions as an industry and the physical limits of energy, land and cooling. The International Energy Agency has projected that global data center electricity use could nearly double by mid-decade, fueled largely by AI workloads. Power is no longer a back-office detail; it's the gating constraint.
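A back-of-envelope check shows why six gigawatts translates into millions of accelerators. The per-device power draw and overhead factor below are illustrative assumptions, not figures from the announcement:

```python
# Back-of-envelope sizing for a 6 GW AI buildout.
# Per-device power and PUE are illustrative assumptions, not official numbers.

total_power_w = 6e9        # 6 GW of total facility power
device_power_w = 1_000     # assumed ~1 kW per accelerator board
pue = 1.3                  # assumed power usage effectiveness (cooling, networking overhead)

# Power left for IT load after facility overhead
it_power_w = total_power_w / pue

devices = it_power_w / device_power_w
print(f"IT power: {it_power_w / 1e9:.2f} GW")
print(f"Approx. accelerators supported: {devices:,.0f}")
```

Under these assumptions the answer lands in the millions of devices, which is why packaging capacity, HBM supply and grid interconnection all become gating factors at this scale.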

Deploying even one gigawatt of AI-ready capacity at a time demands coordination on grid interconnections, substation buildouts and advanced cooling. Once an enthusiast's niche, liquid cooling is quickly becoming mainstream for dense AI clusters. Industry groups such as the Uptime Institute are tracking how concerns about thermal design limits and access to water are influencing site selection, in places including the American Southwest and Northern Europe.

OpenAI and AMD logos over AI chips, signaling the world's largest chip deal

For OpenAI, that means the AMD deal is about power engineering and procurement—long-term renewables contracts, on-site generation, demand response—as much as it is chips.

It’s also a sign to utilities that the arrival of hyperscale AI demand is coming sooner than anybody planned for.

Supply Chain and Pricing Implications for AI Chips

On the manufacturing side, the bottleneck sits squarely in advanced packaging at TSMC and high-bandwidth memory supplied by SK hynix, Micron, and Samsung. In other words, committing to multi-gigawatt deployments is really about reserving scarce inputs years in advance. That is leverage for AMD, which can translate committed demand into long-term wafer and HBM allocations, and, in turn, more competitive pricing for OpenAI.

If OpenAI does take a substantial equity stake, it could bring further alignment on roadmap visibility and priority access. For AMD, tens of billions in incremental revenue could help grow data center share, fund R&D, and pay for the expensive packaging and interconnect technologies necessary for next-generation accelerators.

For the broader market, that will mean tighter supply. As Nvidia ramps Blackwell and AMD iterates beyond MI300, large pre-allocations can leave accelerators scarce for smaller customers. Such shortages tend to drive up the cost of cloud services and lengthen deployment timelines for startups, favoring firms with early bookings and strong credit.

What to Watch Next as OpenAI Scales on AMD Hardware

Everyone’s going to be looking for that first gigawatt and the performance-per-watt that OpenAI pulls from AMD’s stacks at scale. Watch for software updates in ROCm, model-serving improvements that take advantage of large on-package memory, and signs that training runs are transitioning smoothly between AMD and Nvidia fleets.

Also part of the story, right alongside the energy one: power purchase agreements, grid partnerships and anything that advances on-site generation or shields projects from grid-interconnection delays. Regulators and energy planners will analyze the impact of these clusters on local grids, while investors parse whether multi-vendor strategies lower the unit costs of AI inference.

The headline is simple: OpenAI is all in on AMD. The subtext says more: AI's new frontier depends on a three-way race between chips, software and electricity. OpenAI's AMD deal is a bet that balancing all three, across more than one supplier, is how you win.

By Bill Thompson
Bill Thompson is a veteran technology columnist and digital culture analyst with decades of experience reporting on the intersection of media, society, and the internet. His commentary has been featured across major publications and global broadcasters. Known for exploring the social impact of digital transformation, Bill writes with a focus on ethics, innovation, and the future of information.
FindArticles © 2025. All Rights Reserved.