US data center construction hit something like a $40 billion annual pace in June 2025, up roughly 30% from a year earlier, though the exact numbers change depending on who's counting. That's a lot of concrete and steel being poured to house AI infrastructure that burns through electricity like crazy. But here's what nobody's talking about: where the fuck is all this power going to come from?
Microsoft, Google, and Amazon are building these facilities like there's no tomorrow, betting everything that AI workloads will justify the insane capital costs. Each new AI data center can draw 10-100x more power than traditional cloud infrastructure. The International Energy Agency projects global data center electricity demand will more than double by 2030, to around 945 TWh. We're talking about facilities that consume more electricity than small cities, and tech companies are building dozens of them simultaneously across the US. In the US alone, data centers are projected to consume as much as 580 TWh annually by 2028, which would be roughly 12% of the nation's total electricity use.
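To put those terawatt-hour figures in more concrete terms, here's a rough back-of-envelope sketch in Python. The 580 TWh and 945 TWh numbers are the projections cited above; the one-gigawatt "large reactor" yardstick is my own simplifying assumption, not anything from the IEA.

```python
# Back-of-envelope sketch: convert the projected annual energy figures into
# continuous-power terms. The TWh values are the projections cited in the
# text; the ~1 GW "large reactor" yardstick is an assumption for scale.

HOURS_PER_YEAR = 8_760

def avg_gw(twh_per_year: float) -> float:
    """Convert annual energy (TWh) into average continuous power (GW)."""
    return twh_per_year * 1_000 / HOURS_PER_YEAR  # 1 TWh = 1,000 GWh

us_2028_twh = 580      # upper-bound US data center projection for 2028
global_2030_twh = 945  # IEA global projection for 2030

for label, twh in [("US data centers, 2028", us_2028_twh),
                   ("Global data centers, 2030", global_2030_twh)]:
    gw = avg_gw(twh)
    print(f"{label}: {twh} TWh/yr ≈ {gw:.0f} GW of round-the-clock demand "
          f"(roughly {gw:.0f} large ~1 GW reactors running flat out)")
```

Sixty-odd gigawatts of round-the-clock demand is the kind of load utilities plan a decade ahead for, not something you bolt onto a grid that's already straining.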
The Texas power grid nearly collapsed during Winter Storm Uri in 2021, and that was before AI added any meaningful load. Now imagine adding hundreds of AI data centers that each need constant, massive power. America's largest power grid, PJM, is already struggling to meet demand as AI workloads consume power faster than new plants can be built, and reliability assessments warn that stability margins are shrinking. California's grid can barely handle air conditioners in summer, but somehow it's supposed to power ChatGPT queries? The math doesn't add up, but nobody wants to be the one to say it out loud.
Local Communities Are Getting Screwed by This Gold Rush
Here's what happens when Big Tech rolls into your town promising jobs and tax revenue: they build a massive data center that uses more electricity than the rest of the city combined, drives up local power costs, and employs maybe 50 people once construction is done. Virginia residents are already seeing this, with Dominion Energy seeking rate increases of roughly 15% to fund grid upgrades driven by data center demand. In Ohio, typical household electricity bills rose by about $15 per month starting in June specifically because of data centers, and the pattern is spreading nationwide as AI data centers strain local power grids.
These facilities need insane cooling systems because AI chips run hot as hell; I've seen server rooms where you can feel the heat from 20 feet away. We're talking about industrial-scale air conditioning running 24/7/365 to keep millions of dollars' worth of GPUs from literally melting. Cooling alone can eat up around 40% of a data center's power consumption, which explains why Microsoft experimented with underwater data centers (Project Natick) and why other desperate cooling ideas keep getting floated.
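That 40% figure maps onto the metric the industry actually tracks, power usage effectiveness (PUE): total facility power divided by the power that reaches the IT gear. A minimal sketch, assuming the cooling share quoted above plus a small overhead slice for lighting and power conversion that is my own guess:

```python
# Rough sketch relating "cooling eats ~40% of facility power" to PUE
# (power usage effectiveness = total facility power / IT power).
# The 40% cooling share is from the text; the 5% for lighting, UPS
# losses, and other overhead is an assumed number.

cooling_share = 0.40   # fraction of total facility power spent on cooling
other_overhead = 0.05  # assumed: lighting, power conversion losses, etc.

it_share = 1.0 - cooling_share - other_overhead  # what's left for the GPUs
pue = 1.0 / it_share

print(f"IT share of facility power: {it_share:.0%}")
print(f"Implied PUE: {pue:.2f}")  # ~1.8: nearly a watt of overhead per watt of compute
```

Hyperscalers brag about fleet-wide PUEs near 1.1, which is exactly why a cooling bill anywhere near 40% pushes them toward liquid cooling and stunts like sinking servers in the ocean.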
Nvidia Is Making Bank While Everyone Else Pays the Bills
Jensen Huang should send thank-you cards to every AI CEO because they're basically printing money for Nvidia. Each AI data center needs thousands of H100 chips at $25,000-40,000 each, plus the specialized networking hardware to connect them. A single AI training cluster can cost well over $100 million just in hardware, before you even start on the building, power, and cooling.
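The cluster price tag is easy to sanity-check with the chip prices above. In this sketch, the 4,000-GPU cluster size and the 30% add-on for networking, racks, and storage are illustrative assumptions of mine, not anyone's actual bill of materials:

```python
# Hypothetical back-of-envelope cost of an AI training cluster.
# The $25k-40k per-H100 range is from the text; the 4,000-GPU cluster
# size and the 30% add-on for networking fabric, racks, and storage
# are illustrative assumptions.

num_gpus = 4_000
gpu_price_low, gpu_price_high = 25_000, 40_000
infra_markup = 0.30  # assumed: interconnect, racks, storage on top of GPUs

low = num_gpus * gpu_price_low * (1 + infra_markup)
high = num_gpus * gpu_price_high * (1 + infra_markup)

print(f"Hardware alone: ${low/1e6:.0f}M - ${high/1e6:.0f}M")
# ~$130M - $208M before the building, the substation, or the cooling plant
```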
But here's the kicker: these data centers might be functionally obsolete in 3-5 years as AI models change or the bubble pops. Traditional data centers have 15-20 year lifespans. AI data centers are a bet on continued exponential growth in AI demand, and if that growth slows or AI efficiency improves dramatically, billions in infrastructure become expensive concrete boxes full of outdated hardware.
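Compressing the useful life from 15-20 years to 3-5 is what really wrecks the economics, and straight-line depreciation makes the point quickly. The $150 million combined build-plus-hardware figure below is just an assumed round number for illustration:

```python
# Straight-line depreciation sketch: same capital outlay, very different
# annual cost depending on how long the asset stays useful.
# The $150M capex is an assumed round number; the lifespans are the
# 15-20 year traditional vs 3-5 year AI estimates from the text.

capex = 150_000_000  # assumed combined hardware + build-out cost

for label, years in [("Traditional data center (15 yr life)", 15),
                     ("AI data center (4 yr refresh)", 4)]:
    print(f"{label}: ${capex / years / 1e6:.1f}M per year to amortize")
# Roughly 4x the annual cost for the same box if the hardware ages out in 4 years
```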
The real question nobody's asking: what happens to all this infrastructure when companies realize they're spending $115 billion to maybe break even on AI? We might be building the data center equivalent of shopping malls—massive investments that seemed essential until they suddenly weren't.