So Goldman Sachs released a report yesterday saying data center power demand will jump 165% by 2030, driven by AI. Finally, someone with enough money to matter is saying what power engineers have been screaming about for two years: this AI bubble is going to break the fucking grid.
These Numbers Are Probably Wrong (But Still Terrifying)
Goldman says data centers eat around 2% of global electricity today. That feels low - these facilities pull insane amounts of power. But even if it's right, a 165% jump means nearly tripling that by 2030, which is nuts.
H100 racks burn through electricity like crazy - a single H100 draws around 700 watts, and an 8-GPU server lands around 10 kilowatts. Facilities run hundreds of these racks 24/7 because you can't pause AI training when power costs spike. The numbers don't work.
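Quick back-of-envelope on what that looks like at the meter. The rack density, server overhead, and rack count below are my own illustrative assumptions, not figures from Goldman or any vendor:

```python
# Back-of-envelope: what a few hundred H100 racks pull, running 24/7.
# All figures are rough assumptions for illustration, not specs for any
# particular facility.

GPU_WATTS = 700            # roughly an H100 SXM board's draw
GPUS_PER_SERVER = 8
SERVER_OVERHEAD_W = 4_000  # assumed CPUs, memory, NICs, fans per server
SERVERS_PER_RACK = 4       # assumed density for a high-power AI rack
RACKS = 300                # "hundreds of them"
HOURS_PER_YEAR = 8_760

server_w = GPU_WATTS * GPUS_PER_SERVER + SERVER_OVERHEAD_W   # ~9.6 kW per server
rack_kw = server_w * SERVERS_PER_RACK / 1_000                # ~38 kW per rack
site_mw = rack_kw * RACKS / 1_000                            # ~11.5 MW of IT load

annual_gwh = site_mw * HOURS_PER_YEAR / 1_000
print(f"IT load: {site_mw:.1f} MW, ~{annual_gwh:.0f} GWh/year before cooling")
```

With those assumptions you're already past 100 GWh a year for one mid-size AI site, and that's before cooling or redundancy enters the picture.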
The real problem? Nobody actually knows how much power these things will use. Every AI company is guessing based on optimistic projections that assume infinite efficiency improvements. It's like estimating highway capacity based on everyone driving perfectly.
The Grid Is Already Fucked
Here's what Goldman doesn't mention: utilities are freaking out. Grid operators are getting interconnection requests for single facilities that need more power than entire cities, and they have no clue when these will actually come online or whether the infrastructure can handle it.
Most of the transmission system dates to the 1960s and 70s, built for a grid carrying a fraction of today's load. Now AI companies want to drop facilities that draw small-country levels of power into random rural areas with cheap land, and the capacity to deliver it just isn't there.
Building transmission capacity takes decades and costs billions. But AI companies need power next year, not in 2040. MIT research suggests power capping processors could reduce energy usage by 20-40%, but nobody wants to talk about limiting AI performance.
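Here's the basic arithmetic of why capping can save energy even though jobs run longer. The cap level and slowdown below are made-up illustrative numbers, not measurements from the MIT work:

```python
# Illustrative only: why power-capping can cut energy even though jobs slow down.
# The cap level and slowdown are assumed numbers, not MIT's measurements.

uncapped_watts = 700       # assumed full-power GPU draw
capped_watts = 450         # assumed power cap
slowdown = 1.15            # assume the capped job takes 15% longer

baseline_energy = uncapped_watts * 1.0           # relative energy, uncapped
capped_energy = capped_watts * slowdown          # relative energy, capped

savings = 1 - capped_energy / baseline_energy
print(f"Energy saved: {savings:.0%}")            # ~26% with these assumptions
```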
Nuclear Is Too Slow, Renewables Are Too Unreliable
Microsoft and Google are signing nuclear deals like that'll fix everything. New nuclear plants take forever to build - look at Vogtle in Georgia, where the two new units took well over a decade and came in at more than double the original budget. And that's considered a success.
Solar and wind can't handle 24/7 AI training because the sun doesn't shine at night and wind doesn't blow on schedule. You need baseload power that runs constantly, which means coal, gas, or nuclear. And nobody's building new coal plants.
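To see why "just use renewables" is harder than it sounds, here's a rough sketch using typical capacity factors I'm assuming (not figures from the Goldman report), and ignoring storage entirely:

```python
# Rough sketch: nameplate capacity needed to feed a constant 100 MW AI campus,
# ignoring storage. Capacity factors are typical rough US values I'm assuming.

LOAD_MW = 100
capacity_factor = {"solar": 0.25, "onshore wind": 0.35, "nuclear": 0.90}

for source, cf in capacity_factor.items():
    nameplate = LOAD_MW / cf
    print(f"{source}: ~{nameplate:.0f} MW nameplate to average {LOAD_MW} MW")
```

And averaging the load isn't the same as serving it at 3 a.m. with no wind - you still need storage or firm backup on top of all that nameplate.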
I honestly don't know how the math works. AI companies are betting their entire business model on cheap, unlimited electricity that probably doesn't exist.
The Hidden Costs Will Kill Everything
Goldman focuses on electricity consumption but ignores all the other infrastructure costs:
Cooling eats somewhere between 30% and 40% of total power consumption because these chips run hotter than hell - Goldman's own figure is 35-40% of hyperscaler energy use. That cooling load scales right along with the 165% increase they're forecasting (rough math after the water point below).
You need redundant everything because AI workloads can't handle downtime. Backup generators, duplicate power feeds, redundant cooling - it all adds up.
Water usage is insane. These cooling systems drink water like California during a drought. Research shows AI models are massive water guzzlers for both server cooling and electricity generation. Data centers are already competing with cities for water rights across the western US.
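Here's the cooling math I promised above. The cooling shares are the Goldman figures quoted earlier; the 11.5 MW IT load reuses my illustrative rack sketch, and I'm ignoring power distribution losses and other overhead for simplicity:

```python
# If cooling really is 35-40% of total facility energy, here's what that
# means for a given IT load. The 11.5 MW input is the assumed IT load from
# the earlier rack sketch; other overheads are ignored for simplicity.

it_load_mw = 11.5
for cooling_share in (0.35, 0.40):
    # total = IT + cooling, with cooling being `cooling_share` of the total
    total_mw = it_load_mw / (1 - cooling_share)
    print(f"cooling {cooling_share:.0%} of total -> "
          f"{total_mw:.1f} MW at the meter, {total_mw - it_load_mw:.1f} MW just for cooling")
```

Under those assumptions a campus with 11.5 MW of compute needs 17-19 MW at the meter before anyone's model has trained a single token faster.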
When Reality Hits, It'll Hit Hard
If the power grid can't scale fast enough - and everything I've seen suggests it can't - then AI scaling hits a wall way before 2030. Data centers could consume up to 12% of US electricity by 2028, according to Lawrence Berkeley National Laboratory. All those venture capital investments betting on exponential AI growth are fucked.
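To put that 12% ceiling in absolute terms - the total-generation figure below is my own rough assumption (~4,100 TWh/year), not a number from the LBNL report:

```python
# Quick sanity check: what 12% of US electricity means in absolute terms.
# Total US generation here is my own rough assumption, not an LBNL figure.

US_TOTAL_TWH = 4_100
DATA_CENTER_SHARE_2028 = 0.12     # the high-end 2028 share cited above

data_center_twh = US_TOTAL_TWH * DATA_CENTER_SHARE_2028
print(f"12% of the grid ≈ {data_center_twh:.0f} TWh/year")   # ~490 TWh
```

For scale, a single 1 GW reactor running at a 90% capacity factor puts out roughly 8 TWh a year, so that's on the order of sixty reactors' worth of output.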
Hundreds of AI startups assume they'll always have cheap cloud compute. When power constraints drive up data center costs, their business models collapse overnight.
The companies building their own data centers might survive. Everyone else gets priced out.
Nobody Wants to Admit the Math Doesn't Work
This is like the dot-com boom all over again, except instead of fiber optic cables we're running out of electricity. The underlying resource constraint is real, but everyone's pretending efficiency improvements will save them.
Maybe they will. Maybe AI chips will get way more efficient by 2030. Maybe fusion power will work. Maybe we'll solve physics.
But I've been in tech long enough to know that when the fundamentals don't add up, reality eventually wins. And the fundamentals of powering infinite AI growth don't add up.