The following article was written by Santiago Gallino, professor of operations, information and decisions at Wharton.

In December, the nation’s largest grid operator, PJM Interconnection, failed for the first time in its history to procure enough electricity to keep the lights on reliably — falling nearly 6,600 megawatts short of its reserve target for summer 2027. The cause was not a hurricane or a war. It was data centers. Ninety-four percent of the projected load growth came from facilities built to run artificial intelligence. Capacity prices hit an all-time high, and the political backlash has been swift and bipartisan, from Senator Bernie Sanders calling for a moratorium on data center construction to Governor Ron DeSantis rallying Floridians against proposed campuses in their communities.

This is not just an energy policy story. It is a supply chain story — and it carries a warning that every business leader should hear: The AI revolution is being built on an infrastructure foundation that cannot keep pace with the demand being placed on it.

Lead Times Are the Strategy

In supply chain management, one lesson endures: When lead times are long and demand is uncertain, the decisive move is made years before anyone places an order. It is made when someone decides how much capacity to build and which supplier relationships to invest in. The AI energy crisis is what happens when that lesson is ignored at scale.

Consider the hardware required. Large power transformers now take roughly two-and-a-half years to procure. Gas turbines are booked well into the late 2020s. New transmission lines require years of permitting before a single steel tower is erected. These are not footnotes to the AI buildout — they are its binding constraint. Excitement about AI runs at the speed of software; the infrastructure required to sustain it runs at the speed of steel, copper, and concrete. When those two clocks fall out of sync, the result is not a polite delay. It is a hard ceiling on growth.

Any company that treats energy infrastructure as something to be procured when needed, rather than secured years in advance, is already behind. The hyperscalers that moved earliest to lock in power purchase agreements and reserve equipment slots are effectively erecting barriers to entry that latecomers will struggle to overcome, regardless of how much capital they deploy.
You Cannot Scale Faster Than Your Slowest Supplier

In the public imagination, the AI supply chain is about chips, algorithms, and data. In reality, it also depends on a handful of specialized manufacturers in South Korea and Germany who make the silicon steel inside power transformers, the forging shops in Europe that produce gas turbine shafts, and the small number of firms that manufacture the critical components through which electricity enters a substation. These suppliers are invisible until they become the bottleneck — and by then, the cost of delay has already compounded.

This is a lesson COVID-19 taught the semiconductor industry: Global just-in-time optimization creates catastrophic fragility when stress arrives. That same stress is now arriving in energy infrastructure. GE Vernova’s recent $5.3 billion acquisition of full ownership of transformer manufacturer Prolec GE signals exactly how the smartest players are responding. They are not buying transformer businesses because transformers are glamorous; they are buying vertical integration into a chokepoint that stands between their customers and their AI ambitions.

The Hype Horizon vs. the Hardware Horizon

A large language model can be trained and deployed in months. The power plant needed to run it at scale takes the better part of a decade to permit, build, and commission. Data center development is already slowing as developers discover that available grid capacity, not capital, is the constraint. In PJM’s territory alone, data centers are projected to add five to seven gigawatts of demand each year, while only two to three gigawatts of new generation come online. That math does not resolve itself.
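The arithmetic behind that claim is worth making explicit. The following sketch projects the cumulative shortfall using the midpoints of the figures above (five to seven gigawatts of new demand per year against two to three gigawatts of new generation); the four-year window and starting year are illustrative assumptions, not PJM projections.

```python
# Back-of-the-envelope projection of the PJM supply-demand gap.
# Midpoints of the cited ranges: 5-7 GW/yr demand, 2-3 GW/yr supply.
# The 2027-2030 window is an illustrative assumption.

demand_growth_gw = 6.0   # midpoint of projected annual data center demand
supply_growth_gw = 2.5   # midpoint of annual new generation coming online

cumulative_gap_gw = 0.0
for year in range(2027, 2031):
    cumulative_gap_gw += demand_growth_gw - supply_growth_gw
    print(f"{year}: cumulative shortfall of about {cumulative_gap_gw:.1f} GW")
```

Even under these rough midpoint assumptions, the gap compounds to roughly 14 gigawatts within four years, which is why the imbalance cannot be waited out.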

For business leaders, this asymmetry demands clear-eyed planning. If only a fraction of the AI infrastructure pipeline is built and demand does materialize, the grid will be inadequate. If the infrastructure is overbuilt and demand disappoints, stranded assets will be paid for by electricity ratepayers for decades. Avoiding either outcome requires the kind of long-horizon thinking that is the antithesis of the quarterly earnings cycle driving most corporate decisions.

The Lights vs. the Algorithm

There is a final, uncomfortable dimension to this story. Electricity powers hospitals, water treatment plants, and heating systems. AI, for all its promise, remains — for most current applications — a convenience layer on top of civilization, not its foundation. When a megawatt allocated to a data center is a megawatt unavailable to a hospital or a residential neighborhood during a heat wave, that allocation becomes a societal choice, not merely a market transaction.

Regulators are already responding. PJM recently announced reforms requiring data centers that have not secured their own power supply to be curtailed first during emergencies — before residential customers. The same principles that govern military procurement and pandemic logistics — prioritization, resilience, sequencing of critical over noncritical demand — are now being applied to AI’s energy footprint.

The implication for corporate strategists is not to bet against AI. The technology is real, the demand is real, and the supply chain will eventually catch up. The implication is to build with clear eyes: The infrastructure underpinning AI is not a background variable to be optimized away, but the most consequential supply chain decision of this generation. Get it right, and the AI revolution delivers on its promise. Get it wrong, and the bottleneck will not be a chip shortage or a software bug. It will be the most fundamental constraint of all: not enough electricity to go around.
