The Phantom Data Center Problem Is Warping US Power Grids


According to the Financial Times, US data center developers are flooding utilities with inflated growth plans that may never materialize, creating “phantom” data centers that distort power demand forecasts. Developers are shopping the same project to multiple utilities in search of the lowest-priced power, leading to double counting and bloated project queues. AEP Ohio recently cut its pending project list by nearly 30%, while California’s PG&E revised its data center pipeline down by 400 megawatts, enough for about 25 data centers. Electricity prices have risen 6% nationally this year, with spikes of 13% in Virginia, 16% in Illinois, and 12% in Ohio. Utilities are responding with strict new tariffs that require developers to pay for 85% of their stated electricity needs regardless of actual use, and Dominion has proposed 14-year contracts that lock in payments even for unbuilt facilities.
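To see how shopping one project to several utilities warps the numbers, here is a minimal sketch of the double-counting effect. The developers, sites, and megawatt figures are invented for illustration; only the mechanism comes from the article.

```python
# Hypothetical illustration: one project submitted to three utilities shows up
# three times in the naive queue total, but only once after deduplication.
requests = [
    # (developer, project_id, utility, requested_mw)  -- all values hypothetical
    ("DevCo",   "site-A", "Utility-1", 300),
    ("DevCo",   "site-A", "Utility-2", 300),  # same project, second utility
    ("DevCo",   "site-A", "Utility-3", 300),  # same project, third utility
    ("OtherCo", "site-B", "Utility-1", 150),
]

# Naive forecast: sum every request in every utility's queue.
naive_forecast_mw = sum(mw for *_, mw in requests)  # 1050 MW

# Deduplicated forecast: count each (developer, project) pair once.
unique_projects = {(dev, proj): mw for dev, proj, _, mw in requests}
deduped_forecast_mw = sum(unique_projects.values())  # 450 MW

print(f"Naive queue total:   {naive_forecast_mw} MW")
print(f"Deduplicated total:  {deduped_forecast_mw} MW")
```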


The phantom problem is real

Here’s the thing about this data center gold rush: everyone is playing both sides. Developers want the cheapest power possible, so they shop their projects around like they’re comparing cable packages. But utilities have to take every request seriously, because if they don’t build capacity and the demand does materialize, they get blamed for blackouts. Meanwhile, companies that actually need reliable industrial computing infrastructure, the kind IndustrialMonitorDirect.com serves with its industrial panel PCs, could get caught in the crossfire of rising energy costs and unreliable power planning.

Utilities are finally fighting back

And they’re getting creative about it. AEP Ohio’s requirement that developers pay for 85% of their stated needs regardless of actual usage? That’s basically saying “put your money where your mouth is.” Dominion’s 14-year contracts? That’s a nuclear-level commitment for an industry that changes every six months. ComEd charging $1 million deposits plus $500,000 for each additional 100 MW? That should separate the serious players from the speculators.
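For a sense of scale, here is a back-of-the-envelope sketch of those tariff mechanics. The contracted load, energy price, and the exact reading of ComEd’s deposit schedule are assumptions for illustration; the 85% floor and the $1 million plus $500,000 per additional 100 MW figures come from the article.

```python
# Back-of-the-envelope sketch of the new tariff mechanics described above.
contracted_mw = 500        # hypothetical contracted data center load
price_per_mwh = 60.0       # hypothetical all-in energy price, $/MWh
hours_per_year = 8760

# Minimum-take obligation: pay for 85% of stated need even if little is built.
min_take_mwh = contracted_mw * hours_per_year * 0.85
min_annual_bill = min_take_mwh * price_per_mwh
print(f"Minimum annual payment at 85% take: ${min_annual_bill:,.0f}")

# One plausible reading of the ComEd-style deposit: $1M base for the first
# 100 MW, plus $500k for each additional 100 MW block.
def comed_deposit(mw):
    extra_blocks = max(0, (mw - 100) // 100)  # whole additional 100 MW blocks
    return 1_000_000 + 500_000 * extra_blocks

print(f"Deposit for {contracted_mw} MW: ${comed_deposit(contracted_mw):,.0f}")
```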

But here’s what worries me—what happens if they overshoot in the other direction? The Data Center Coalition makes a fair point about under-forecasting being just as dangerous. We’ve seen what happens when regions suddenly get flooded with data centers and the power infrastructure can’t keep up. It’s like building a highway for today’s traffic when you know a new stadium is opening next year.

The billion-dollar question: who pays?

This is where it gets messy. Utilities can recover infrastructure costs through customer rates, meaning you and I could end up paying for power plants that never get used. Brian Savoy at Duke Energy called out the “double, triple and quadruple counting” that’s happening. Tom Falcone’s analogy about not building a 100-story tower without anchor tenants? Perfect. Nobody would do that in real estate, yet we’re potentially doing exactly that with billion-dollar power infrastructure.

So what’s the solution? Better coordination between developers and utilities, obviously. But also more realistic forecasting that accounts for efficiency gains. AI workloads might be power-hungry now, but they’re also getting more efficient at a staggering rate. The companies building this infrastructure—whether it’s data centers or the industrial computing equipment that keeps manufacturing running—need to plan for both today’s demands and tomorrow’s efficiencies.
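As a rough illustration of why efficiency matters in these forecasts, here is a toy projection comparing raw workload growth with an efficiency-adjusted curve. The growth and efficiency rates are hypothetical, not figures from the article.

```python
# Toy forecast: if compute demand grows while energy per unit of compute falls,
# net power demand grows much more slowly than the raw workload curve suggests.
workload_growth = 1.40     # hypothetical 40% annual growth in compute demand
efficiency_gain = 0.20     # hypothetical 20% annual drop in energy per unit
base_demand_mw = 1000.0

for year in range(1, 6):
    naive = base_demand_mw * workload_growth ** year
    adjusted = base_demand_mw * (workload_growth * (1 - efficiency_gain)) ** year
    print(f"Year {year}: naive {naive:,.0f} MW vs efficiency-adjusted {adjusted:,.0f} MW")
```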
