OpenAI has officially appealed to the Trump administration to expand the Chips Act's Advanced Manufacturing Investment Credit to cover the backbone of artificial intelligence infrastructure, including parts of the electrical grid as well as AI servers and full-scale data centers. In a letter from Chris Lehane, the company's chief global affairs officer, to Michael Kratsios, the White House science and technology chief, OpenAI argues that broadening the incentive beyond chip fabrication would speed domestic AI build-out and reduce its financing risk.
What OpenAI is proposing for AI data center tax credits
The Advanced Manufacturing Investment Credit currently offers a 35% tax credit under the Chips Act, aimed primarily at boosting U.S. chipmaking capacity. OpenAI is asking that the credit be interpreted broadly enough to include the capital-intensive infrastructure needed to deploy those chips at scale. The letter also calls for speeding up permitting and environmental reviews for AI campuses, and for building a strategic reserve of critical materials, such as copper, aluminum, and processed rare earths, to unclog supply lines.

OpenAI describes the change as a way to reduce the cost of capital, de-risk early-stage investments, and make private markets more liquid. A 35% credit on a multibillion-dollar data center campus would mean materially lower up-front outlays and a lower weighted average cost of capital, which typically shortens deployment timetables as demand shifts toward hyperscale consumption.
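For a rough sense of the arithmetic, here is a minimal sketch, using entirely hypothetical figures (the campus cost and cash flow are illustrative assumptions, not numbers from OpenAI's letter), of how a 35% investment tax credit flows through to a project's up-front outlay and simple cash-on-cash return:

```python
# Hypothetical figures for illustration only; not from OpenAI's letter.
capex = 10_000_000_000        # assumed $10B data center campus cost
credit_rate = 0.35            # credit rate cited for the expanded incentive

# The credit offsets taxes owed, reducing the effective up-front outlay.
net_outlay = capex * (1 - credit_rate)

# Holding annual cash flow fixed, the lower outlay raises the simple
# cash-on-cash return, which is one way the credit de-risks the project.
annual_cash_flow = 1_200_000_000   # assumed annual net cash flow
return_before = annual_cash_flow / capex
return_after = annual_cash_flow / net_outlay

print(f"Net outlay after credit: ${net_outlay:,.0f}")
print(f"Simple return without credit: {return_before:.1%}")
print(f"Simple return with credit:    {return_after:.1%}")
```

Under these assumed numbers, the same project requires $6.5 billion rather than $10 billion up front, and its simple return rises from 12% to roughly 18.5%; the actual effect on a real campus would depend on financing structure and how the credit is monetized.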
How tax policy meets AI infrastructure and deployment
U.S. policymakers originally designed the Chips Act around semiconductor sovereignty, pairing direct grants with a tax credit to reshore fabrication. OpenAI's pitch extends to the next link in the chain: the compute, storage, networking, and power systems that turn chips into capacity for training and serving large AI models. Much will hinge on Treasury guidance; what counts as "advanced manufacturing" versus downstream deployment could decide whether servers, power-hungry accelerators, liquid-cooling plants, substations, and transformers are in or out.
The stakes are large. According to Synergy Research, hyperscale capital expenditures now exceed $200 billion annually, and AI has shifted spending priorities toward high-density compute and advanced power and cooling. Goldman Sachs analysts have forecast that data centers could represent roughly 8% of U.S. electricity load by 2030, implying record levels of grid investment. An expanded credit would shift billions of dollars in after-tax economics, tilting site selection toward U.S. locations with land, water, and interconnection.
Permitting and power represent the biggest AI bottlenecks
OpenAI's letter highlights two pain points: grid access and permitting. Interconnection queues for large loads have ballooned in many regions, with utilities citing multi-year timelines for transmission upgrades. Industry surveys suggest lead times for large power transformers can exceed 100 weeks, which could constrain the build-out of 100–300 MW campuses and the gigawatt-scale clusters required for next-generation AI models. Federal acceleration of National Environmental Policy Act reviews, coupled with a materials reserve, could unblock critical-path items, provided the plan is coordinated with the Department of Energy and grid operators.

The materials ask is notable. High-density data centers consume large amounts of copper, from switchgear to bus ducts to long runs of cable, while rare-earth demand is tied mainly to high-efficiency motors and some power electronics. Ensuring reliable supply at predictable prices is as much a project-finance problem as an industrial-policy question; in effect, OpenAI is arguing that supply shocks are now macro risks to U.S. AI competitiveness.
Executive messaging and the scale of the AI market plan
The request received added attention after a short-lived swirl over whether OpenAI was seeking loan backstops from the Federal Reserve. CFO Sarah Friar has said on the record that the company is not seeking government guarantees to support data center financing. CEO Sam Altman similarly said that OpenAI doesn't want guarantees for its data centers, even as he acknowledged that federal support could make sense to help jump-start U.S. semiconductor fabs, consistent with the original national-security premise of the Chips Act.
OpenAI underscored the scale of its ambitions, describing an annualized revenue run rate of over $20 billion and capital commitments on the order of $1.4 trillion over eight years. The figures reflect the rising cost of frontier AI, in which compute demand doubles rapidly and each new model generation requires higher-capacity clusters, more energy, and denser cooling. In that environment, tax policy and permitting act as levers that either unlock or impede private investment.
Policy outlook and industry reaction to OpenAI’s request
Any expansion of the credit would likely involve the Treasury Department and the Commerce Department, along with the White House Office of Science and Technology Policy and input from the Department of Energy on grid constraints. Chip-industry advocates may back the targeted steps as a way to bolster domestic demand for advanced chips, while fiscal hawks may object to mission creep and overlap with existing energy and manufacturing incentives.
Siting and load management will rest with utilities and regional grid operators, which suggests a push toward demand response, on-site generation, and co-location with new transmission to minimize system stress. Labor and construction groups, meanwhile, could back the move given the number of skilled jobs tied to substations, chillers, and electrical work on AI campuses. Whether the administration proceeds via regulatory interpretation or new legislation will telegraph how Washington intends to weigh semiconductor strategy against an equally challenging project: powering the AI era.
