An internal OpenAI memo articulates a goal that would have been unimaginable just a few years ago: 250 gigawatts of compute capacity by 2033. According to independent analyses, running that hardware around the clock would draw roughly as much electricity from the grid as all of India consumes in a year. The harsh reality is that the energy and climate demands at that scale are extreme even for today's cutting-edge renewable technologies.
The figure, which emerged in coverage of OpenAI's long-term buildout, is part of a broader arms race, with Google, Amazon, Microsoft, Meta, and xAI all announcing hyperscale data center campuses and onsite generation projects. According to Truthdig's modeling, if OpenAI hits its goal, the buildout could drive a substantial rise in annual emissions for years to come, unless it is matched by an enormous expansion of clean power.

What 250 Gigawatts of Continuous Compute Really Means
Run continuously, 250 GW works out to roughly 2,190 terawatt-hours a year (250 GW × 8,760 hours). To put that in context, all of India's electricity consumption has been in the range of 1,700–1,800 TWh in recent years, according to both the International Energy Agency and national statistics. In other words, one company's declared compute ambitions could require more electricity than one of the world's largest countries consumes in a year.
The comparison to today's data center footprint is even more striking. The IEA estimated that global data centers used approximately 460 terawatt-hours in 2022, and projects they could consume anywhere from 620 to 1,050 TWh by 2026 as AI and cryptocurrency workloads grow. OpenAI's target alone would dwarf current worldwide data center consumption, necessitating either an extraordinary grid buildout or an unprecedented surge of behind-the-meter generation.
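As a sanity check, here is a minimal back-of-the-envelope sketch in Python using the approximate figures cited above; the inputs are this article's round numbers, not precise measurements.

```python
# Back-of-the-envelope check on the scale comparisons above.
# All inputs are the approximate figures cited in this article.

CAPACITY_GW = 250          # OpenAI's reported 2033 target
HOURS_PER_YEAR = 8_760     # 24 hours x 365 days

annual_twh = CAPACITY_GW * HOURS_PER_YEAR / 1_000   # GWh -> TWh
print(f"Continuous draw: {annual_twh:,.0f} TWh/year")        # ~2,190

INDIA_TWH_LOW, INDIA_TWH_HIGH = 1_700, 1_800   # recent annual consumption
DATACENTERS_2022_TWH = 460                     # IEA global estimate

print(f"vs. India: {annual_twh / INDIA_TWH_HIGH:.2f}-"
      f"{annual_twh / INDIA_TWH_LOW:.2f}x")                  # ~1.22-1.29x
print(f"vs. all 2022 data centers: "
      f"{annual_twh / DATACENTERS_2022_TWH:.1f}x")           # ~4.8x
```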
And that's just the electricity. Improvements in power usage effectiveness (PUE) and liquid cooling can reduce overhead, but big AI clusters still require substantial cooling and often a large water draw. Water use is already significant, according to company sustainability reports: Microsoft said it consumed some 1.7 billion gallons in 2022, a 34 percent increase over the previous year, while Google reported several billion gallons across U.S. sites. Scaling up to hundreds of gigawatts would amplify those local impacts unless designs move decisively toward low-water cooling systems.
The Hardware and Supply Chain Crunch Facing AI Growth
Power is only one bottleneck. Reaching a 250 GW goal by 2033 would necessitate the yearly purchase of tens of millions of high-end GPUs, along with networking gear, memory, and advanced packaging capacity, resources that are already stretched by exploding demand. Sustaining such a fleet would require the full output of approximately 10 leading-edge semiconductor fabrication plants just to replace aging accelerators, according to analysis by Truthdig, and that is before counting the energy, water, and chemicals those fabs consume.
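To see where "tens of millions" comes from, consider a hedged sketch; the per-device power draw and refresh cycle below are illustrative assumptions, not figures from OpenAI or Truthdig.

```python
# Hedged sanity check on the accelerator count. The per-device power
# and refresh cycle are illustrative assumptions, not sourced figures.

TOTAL_WATTS = 250e9            # the 250 GW target, in watts
WATTS_PER_ACCELERATOR = 1_500  # assumed: GPU plus server and cooling overhead
REFRESH_YEARS = 4              # assumed hardware replacement cycle

fleet = TOTAL_WATTS / WATTS_PER_ACCELERATOR
per_year = fleet / REFRESH_YEARS

print(f"Implied fleet: ~{fleet / 1e6:,.0f} million accelerators")    # ~167M
print(f"Replacement rate: ~{per_year / 1e6:,.0f} million per year")  # ~42M
```

Under these assumptions, simply keeping the fleet current means buying tens of millions of accelerators every year, which is consistent with the fab-capacity crunch described above.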
Chipmaking itself is an extremely carbon- and resource-intensive process. TSMC and Samsung have both committed to aggressive renewable procurement, but their electricity demand is growing rapidly as advanced nodes and packaging lines ramp up. Packaging approaches like CoWoS have been chronic bottlenecks for AI accelerators, and scaling them out demands more specialized tooling, trained labor, and reliable power in multiple regions.

There is a knock-on effect for everyone else: when hyperscalers tie up accelerator supply, smaller AI startups and research institutions end up paying higher prices and coping with longer lead times. Local grids face the same squeeze. U.S. regional operators such as PJM and ERCOT are fielding large interconnection requests from data centers, while European hubs like Dublin have imposed connection limits because the local grid is saturated. The Netherlands went so far as to "temporarily halt" permitting for new hyperscale sites in 2022 while it reconsidered siting and infrastructure requirements.
Can A.I. Grow Without Driving Up Emissions?
How OpenAI and its rivals get their power will decide whether AI's carbon footprint balloons or levels off. Short-term fixes such as on-site gas turbines and diesel backup generators solve uptime but risk locking in emissions. The IEA has sounded a caution, too: if the world fails to build clean power quickly, data center growth will drive electricity emissions higher just as grids are supposed to be decarbonizing.
The alternative is to embrace scale and speed: long-term power purchase agreements for wind and solar paired with storage, grid-friendly load shifting, and siting near abundant low-carbon baseload like hydro and nuclear. A few tech players are already looking beyond today's grid. Microsoft's first-of-its-kind offtake deal with fusion startup Helion, and OpenAI CEO Sam Altman's role as chair of the advanced fission firm Oklo, are strong signals that the sector wants firm, clean supply at multi-gigawatt scale. But fusion remains unproven at commercial scale, and even new fission plants face long lead times and regulatory hurdles.
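For a sense of what multi-gigawatt clean supply actually means, here is a rough sizing sketch; the capacity factors are typical published ranges, assumed here for illustration rather than project-specific figures.

```python
# Rough sizing of clean generation needed to serve ~2,190 TWh/year.
# Capacity factors are typical ranges, assumed here for illustration.

DEMAND_TWH = 2_190
HOURS_PER_YEAR = 8_760

capacity_factors = {"solar": 0.25, "onshore wind": 0.35, "nuclear": 0.90}

for source, cf in capacity_factors.items():
    # nameplate GW = annual GWh demand / (hours x capacity factor)
    gw = DEMAND_TWH * 1_000 / (HOURS_PER_YEAR * cf)
    print(f"{source}: ~{gw:,.0f} GW of nameplate capacity")
```

Even the most favorable case here, nuclear at roughly 280 GW, implies hundreds of gigawatts of new firm, clean capacity dedicated to a single company's fleet.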
Efficiency still matters. Better performance per watt, smarter workload scheduling, model pruning, and improved PUE can all bend the curve. But historically, efficiency gains have tended to spur more demand. If AI adoption continues to accelerate into search, software, science, and autonomous systems, linear efficiency improvements may simply be outrun by exploding usage, even if developers fully internalize the energy cost of training and inference.
The headline takeaway is simple but profound: a single company's ambition, if realized as described, would match the energy demands of a major nation. That is not simply a corporate capex story; it is an energy, climate, and industrial policy decision. Grid planners and regulators from the United States to Europe to Asia are already struggling with where to site these loads, how they will be powered cleanly, and who should foot the bill for expansion. OpenAI has drawn the map; now we have to navigate it without overshooting the limits of our climate and communities.
