
OpenAI and Nvidia’s $100B AI Plan Demands Power Equivalent to Ten Nuclear Reactors


Investment Structure and Market Reaction

Bryn Talkington, managing partner at Requisite Capital Management, highlighted the circular nature of the $100 billion investment, stating, "Nvidia invests $100 billion in OpenAI, which then OpenAI turns back and gives it back to Nvidia," and adding, "I feel like this is going to be very virtuous for Jensen." The remark underscores how tightly the two firms' finances are intertwined as they embark on an unprecedented AI infrastructure build-out.

Cost, Scale, and Financial Outlook

During an earnings call, Nvidia CEO Jensen Huang explained that constructing one gigawatt of data‑center capacity costs between $50 billion and $60 billion, with roughly $35 billion allocated to Nvidia chips and systems. At that rate, a 10‑gigawatt project could require total investment exceeding $500 billion. Huang noted that the $100 billion infusion sits on top of Nvidia’s existing commitments and was not reflected in the company’s recent financial forecasts.
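For scale, here is a rough back-of-envelope sketch of those figures. The per-gigawatt costs come from Huang's remarks; the assumption that they scale linearly across the full 10 gigawatts is ours, for illustration only.

```python
# Back-of-envelope check of the build-out cost figures cited above.
# Assumes Huang's per-gigawatt estimates apply linearly to 10 gigawatts.

cost_per_gw_low = 50e9      # USD per gigawatt, low end of Huang's estimate
cost_per_gw_high = 60e9     # USD per gigawatt, high end
nvidia_share_per_gw = 35e9  # USD of that going to Nvidia chips and systems

project_gw = 10  # planned capacity of the OpenAI-Nvidia build-out

total_low = project_gw * cost_per_gw_low            # $500 billion
total_high = project_gw * cost_per_gw_high          # $600 billion
nvidia_hardware = project_gw * nvidia_share_per_gw  # ~$350 billion

print(f"Total build-out: ${total_low / 1e9:.0f}B to ${total_high / 1e9:.0f}B")
print(f"Implied Nvidia hardware spend: ~${nvidia_hardware / 1e9:.0f}B")
```

At the low end of that range, the arithmetic lands right at the $500 billion figure cited above, with more than a third of the spending flowing back to Nvidia as hardware purchases.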

Energy Partnerships and Nuclear Tie‑Ins

Although OpenAI and Nvidia have not specified their power sources, the massive energy demand has spurred other technology leaders to pursue nuclear options. Microsoft signed a 20‑year agreement to restart the Three Mile Island reactor, delivering 835 megawatts. Amazon Web Services acquired a data‑center site adjacent to Pennsylvania’s Susquehanna nuclear plant, planning to draw up to 960 megawatts.

Regional AI Data‑Center Initiatives

In Cheyenne, Wyoming, officials announced plans for an AI data center that could eventually scale to 10 gigawatts; even its initial 1.8-gigawatt phase would consume more electricity than every home in Wyoming combined. Constellation Energy CEO Joe Dominguez disclosed that OpenAI CEO Sam Altman has expressed interest in building five to seven data centers of 5 gigawatts each. Alex de Vries of Digiconomist warned that seven 5-gigawatt facilities would double the power consumption of New York State, as the rough calculation below illustrates.
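A quick sanity check of de Vries's comparison, assuming the facilities run near full load year-round. The New York State consumption baseline is an assumption for illustration, not a figure from the article.

```python
# Rough check of the seven-site scenario described above.

hours_per_year = 8760
planned_sites = 7   # upper end of Altman's reported interest
gw_per_site = 5     # gigawatts per data center

total_gw = planned_sites * gw_per_site         # 35 GW of continuous load
annual_twh = total_gw * hours_per_year / 1000  # ~307 TWh per year

ny_state_twh = 150  # assumed annual New York State electricity use, TWh

print(f"Seven 5 GW sites: ~{annual_twh:.0f} TWh per year")
print(f"Versus New York State: ~{annual_twh / ny_state_twh:.1f}x")
```

Under those assumptions, the seven sites would draw roughly 300 terawatt-hours a year, about twice what New York State consumes.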

Global Energy Impact and Grid Constraints

The International Energy Agency estimates that data centers already accounted for roughly 1.5 percent of global electricity use in 2024. Continued AI expansion could push global data‑center electricity demand to 945 terawatt‑hours by 2030. Existing power‑grid connections represent bottlenecks in many regions, with utilities struggling to keep pace with rapid AI‑driven demand.
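To put the IEA projection in context, a rough calculation of what the 1.5 percent share implies, assuming global electricity consumption on the order of 30,000 terawatt-hours in 2024 (an assumed baseline, not a figure from the article):

```python
# Rough scale of the IEA figures cited above.

global_twh_2024 = 30_000        # assumed world electricity use in 2024, TWh
data_center_share_2024 = 0.015  # ~1.5 percent, per the IEA estimate

data_center_twh_2024 = global_twh_2024 * data_center_share_2024  # ~450 TWh
projected_twh_2030 = 945        # IEA projection for 2030, TWh

print(f"Data centers in 2024: ~{data_center_twh_2024:.0f} TWh")
print(f"Projected 2030: {projected_twh_2030} TWh "
      f"(~{projected_twh_2030 / data_center_twh_2024:.1f}x the 2024 level)")
```

Under that rough baseline, the 2030 projection would roughly double data centers' current draw on the world's grids.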

Future Outlook

Both companies expect to finalize the details of the partnership in the coming weeks. The scale of the investment, combined with the unprecedented power requirements, positions the OpenAI‑Nvidia alliance as a defining moment for AI infrastructure, while also highlighting the urgent need for sustainable, high‑capacity energy solutions.


Source: Ars Technica
