AI Data Center Electricity Costs Are Landing on Your Bill Right Now

By Craig Nash
AI-powered tech writer covering artificial intelligence, chips, and computing.

AI data center electricity costs are no longer an abstract infrastructure problem: they are showing up in household electricity bills right now, and the numbers are stark. US data center power demand reached around 80 GW in 2025 and is projected to hit 150 GW by 2028, according to a January 2026 Bloom Energy report. That three-year jump is equivalent to adding Spain's entire energy demand to the US grid. States are starting to push back, but the bill has already arrived for millions of households.

Key Takeaways

  • US data center power demand is projected to nearly double, from 80 GW in 2025 to 150 GW in 2028, per Bloom Energy.
  • Electricity prices in areas with heavy data center concentration rose 267% over the past five years, according to a Bloomberg analysis.
  • Removing data centers from PJM’s 2026/27 forecast would cut capacity payments by $9.33 billion — a 64% reduction.
  • In Virginia, 75% of voters blame data centers for rising electricity bills, per a January 2026 survey.
  • A Virginia resident’s bill spiked from around $100 to $281 in January 2026, illustrating the household-level impact.

How AI data center electricity costs are hitting households

The mechanism is straightforward but rarely explained. When data centers flood a grid region with demand, grid operators must procure more generation capacity to keep the lights on — and those capacity costs get spread across all ratepayers, residential and commercial alike. It’s a socialization of costs that benefits tech companies while the burden falls on everyone else.
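
To see why that socialization happens even when data centers pay their own pro rata share, here is a minimal Python sketch. The allocation rule and every dollar figure are illustrative assumptions, not PJM's actual tariff; the one grounded input is the article's point that data center demand roughly doubles the total capacity bill.

```python
# Toy model of capacity cost socialization: the grid operator's total
# procurement cost is split across customer classes pro rata to load.
# All numbers and the allocation rule are illustrative assumptions,
# not PJM's actual tariff math.

def capacity_charge(total_cost_usd, loads_mw):
    """Split a total capacity cost across customer classes by load share."""
    total_mw = sum(loads_mw.values())
    return {name: total_cost_usd * mw / total_mw for name, mw in loads_mw.items()}

# Before: a hypothetical $5B capacity bill with no data center load.
before = capacity_charge(5e9, {"residential": 60_000, "commercial": 40_000})

# After: 12 GW of data center load arrives and, per the article's PJM
# figures, the total capacity bill roughly doubles.
after = capacity_charge(10e9, {"residential": 60_000, "commercial": 40_000,
                               "data_centers": 12_000})

for name in before:
    print(f"{name}: ${before[name] / 1e9:.2f}B -> ${after[name] / 1e9:.2f}B")
# residential: $3.00B -> $5.36B
```

In this toy version the residential class's payment goes from $3.0 billion to about $5.4 billion: the data centers do pay a share, but the doubled total means everyone else pays more too.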

Virginia’s Data Center Alley is the clearest example. PJM Interconnection, the grid operator covering much of the eastern US, saw real-time electricity prices spike to $1,800 per MWh in Virginia’s Dominion zone during the 2025/26 winter — compared to $700 per MWh system-wide. Virginia resident John Steinbach told Consumer Reports his January 2026 bill hit $281, roughly triple what he had been paying before. He is not an outlier. A January 2026 survey by the Global Strategy Group and the Chesapeake Climate Action Network Action Fund found that 75% of Virginia voters directly blame data centers for their rising costs.

A single hyperscale data center draws around 100 MW — enough to power 100,000 households. Meta’s Hyperion project in Louisiana requires at least 5 GW, which is three times the total electricity consumption of New Orleans, according to the International Energy Agency and the Institute on Taxation and Economic Policy. When facilities at that scale connect to a regional grid, the capacity math changes for every other customer on that grid.
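
Those comparisons are easy to sanity-check. A quick back-of-envelope sketch, using only the figures quoted above (the per-household and New Orleans numbers are implied, not independently sourced):

```python
# Back-of-envelope scale check on the figures quoted above.
hyperscale_mw = 100          # typical hyperscale draw cited in the article
households = 100_000         # households the article says that powers
avg_household_kw = hyperscale_mw * 1_000 / households
print(f"Implied average household draw: {avg_household_kw:.1f} kW")  # 1.0 kW

hyperion_gw = 5.0                  # Meta's Hyperion requirement
new_orleans_gw = hyperion_gw / 3   # article: Hyperion = 3x New Orleans
print(f"Implied New Orleans consumption: {new_orleans_gw:.2f} GW")   # ~1.67 GW
```

An implied 1 kW average draw per household is in line with typical US residential averages, so the headline comparison holds up.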

The PJM capacity payment problem

PJM’s 2026/27 capacity auction data makes the scale of AI data center electricity costs impossible to ignore. PJM’s Independent Market Monitor analysis found that removing all data centers from the forecast would cut peak load by 7,927 MW and reduce capacity payments by $9.33 billion — a 64% drop. Even removing only the data centers not yet energized cuts payments by $7.74 billion, a 53% reduction.
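
The Monitor's two scenarios also imply the same underlying total, which is a useful consistency check. A quick sketch, using only the figures above:

```python
# Back out the implied total 2026/27 capacity payment from the Monitor's figures.
cut_all, pct_all = 9.33e9, 0.64    # remove all data centers
cut_new, pct_new = 7.74e9, 0.53    # remove only not-yet-energized ones

print(f"Implied total (all scenario): ${cut_all / pct_all / 1e9:.1f}B")  # ~$14.6B
print(f"Implied total (new scenario): ${cut_new / pct_new / 1e9:.1f}B")  # ~$14.6B

# Rough average cost per MW of data-center-driven peak load:
peak_cut_mw = 7_927
print(f"~${cut_all / peak_cut_mw / 1e6:.2f}M per MW-year")               # ~$1.18M
```

Both scenarios point to a total capacity bill of roughly $14.6 billion, so the two percentages describe the same auction consistently.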

PJM attributes roughly 7.9 GW of additional data center demand in 2025/26 and approximately 12 GW in 2026/27, which has effectively doubled capacity costs for those periods. That doubling doesn’t happen in a vacuum — it gets baked into the rates that households and small businesses pay, whether or not they benefit from the AI infrastructure driving the demand.

There’s also a reliability question buried in PJM’s own forecasting record. The 2024 data center load came in 800 MW below the 2023 forecast, and the 2025 figure came in 1.1 GW below the 2024 forecast. If load projections consistently overshoot, grid operators may be procuring — and billing ratepayers for — capacity that never gets used. That’s a structural problem with how speculative data center demand gets treated in capacity markets.
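
There is no published line item for what an overshoot costs ratepayers, but a rough yardstick can be built from the Monitor's own numbers. The sketch below applies the 2026/27 implied average cost per MW to the 1.1 GW miss; since the miss and the auction cover different delivery years, treat the result as an illustration, not an estimate of actual charges.

```python
# Rough yardstick only: applies the 2026/27 implied average capacity cost
# to the 2025 forecast miss, which covers a different delivery year.
avg_cost_per_mw = 9.33e9 / 7_927   # ~$1.18M per MW-year, from the Monitor's figures
forecast_miss_mw = 1_100           # 2025 load came in 1.1 GW under forecast
overshoot_cost = avg_cost_per_mw * forecast_miss_mw
print(f"~${overshoot_cost / 1e9:.1f}B of capacity potentially procured "
      f"for load that never arrived")
```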

How does PJM compare to ERCOT on data center costs?

Texas’s ERCOT grid operates on a fundamentally different model. ERCOT pays generators only when they’re actually producing during periods of tight supply, which creates a direct incentive for reliability. PJM, by contrast, pays plants regardless of whether they perform — including during cold weather events when some have historically failed to deliver. That structural difference matters when evaluating whether capacity payment inflation driven by data centers is an inevitable cost or a policy choice that could be redesigned.

The ERCOT approach isn’t perfect — Texas’s grid has its own well-documented vulnerabilities — but the contrast illustrates that how a grid operator structures its market directly determines who absorbs the cost of demand spikes. PJM’s model, as currently designed, socializes data center-driven costs across all ratepayers. That’s the core of the growing state-level revolt.

Will the state revolt actually change anything?

More US states are now examining whether large industrial customers like data centers should bear more of the capacity costs their demand creates, rather than spreading those costs across residential ratepayers. The political pressure is real: a 267% electricity price increase over five years in areas with heavy data center concentration, per Bloomberg, is not something voters ignore.
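
For scale, here is what that five-year figure implies as an annual rate, assuming "rose 267%" means prices ended at 3.67 times their starting level (Bloomberg's exact baseline is not specified here):

```python
# Annualized rate implied by a 267% rise over five years.
# Assumes "rose 267%" means final price = 3.67x the starting price.
growth_multiple = 1 + 2.67
years = 5
annual_rate = growth_multiple ** (1 / years) - 1
print(f"Implied annual increase: {annual_rate:.1%}")   # ~29.7% per year
```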

The outcome isn’t guaranteed. Data center operators argue they create jobs, tax revenue, and critical digital infrastructure. Utilities argue that grid upgrades benefit everyone. But when 75% of voters in the most data center-dense region in the country say their bills are being driven up by tech infrastructure, the political calculus is shifting. The question is whether regulatory reform moves fast enough to outpace the next wave of AI buildout.

Is it fair to make households pay for data center grid costs?

That’s the central dispute. Under current PJM market rules, capacity costs driven by data center demand are spread across all ratepayers — residential, commercial, and industrial. Critics argue this socializes the costs of private AI infrastructure onto households that receive no direct benefit. Supporters argue that grid capacity is a shared resource and all users benefit from a reliable grid.

How much are data centers actually adding to electricity bills?

The PJM Independent Market Monitor estimates that data centers account for $9.33 billion in capacity payments in the 2026/27 auction period alone, based on the full data center load forecast. In Virginia's Dominion zone, one household saw its bill roughly triple, from around $100 to $281, by January 2026. Bloomberg's analysis found electricity prices rose 267% over five years in areas with heavy data center concentration.

What is PJM Interconnection and why does it matter here?

PJM Interconnection is the regional transmission organization managing the electricity grid across 13 US states and Washington DC, covering around 65 million people. It runs capacity auctions that determine how much generators get paid to be available — and those auction prices directly affect what ratepayers pay. Because Data Center Alley in Northern Virginia sits within PJM’s territory, the region has become ground zero for the data center electricity cost debate.

The AI buildout isn’t slowing down, and neither is the electricity demand it creates. The real question isn’t whether AI data center electricity costs will keep rising — the Bloom Energy projections and PJM auction data make that trajectory clear. The question is whether regulators will restructure who pays for that growth before the next wave of hyperscale projects comes online. Right now, the answer is your electricity bill.

This article was written with AI assistance and editorially reviewed.

Source: TechRadar
