Business Advice Memoir

Waterwise Gridlock

With the AWS cloud outage that occurred yesterday, we are probably all wondering if we should be paying more attention to the AI and data center buildout issues that are being written about every day. I am also seeing more articles about how the data center demands on both power grids AND water supplies are starting to impact communities. Data centers use staggering amounts of both electricity and water, and both are starting to become a major constraint on the AI boom that is so much a part of the 2025 economic miracle we are enjoying in this country. While the hype about the impact of AI is clearly overdone, it is easy to confirm that AI is currently contributing 1-2 percentage points to economic growth. With real GDP growth at 3.8%, that is still a very hefty portion (roughly 25-50%) of our economic story. In other words, if you want to think and talk intelligently about the AI economy, you had better understand the issues surrounding data centers, the electric grid, and water availability.

Here on this hilltop, while we care about both electricity and water, we have the former more in hand, given the bright sunshine hours we enjoy and the proliferation of solar panels on our roofs and the Tesla batteries on the sides of our garages. My electric needs are not insubstantial given that I have a full EV in my truck and now a plug-in hybrid for Kim’s car. That means Kim is pretty much on the charger every night and I am on the charger more or less every third night. In fact, I am having a second charging station installed at our garage (this one on the outside) so that we don’t have to jockey cars around and we have a charge port for guests (who increasingly also have EVs or plug-in hybrids).
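That share math is worth checking, and a few lines do it (a sketch; the 1-2 percentage-point AI contribution and the 3.8% growth rate are the figures cited above):

```python
# How much of 3.8% real GDP growth do 1-2 percentage points of
# AI contribution represent? (Figures as cited in the text.)
gdp_growth = 3.8                         # real GDP growth, percent
ai_low, ai_high = 1.0, 2.0               # AI contribution, percentage points

share_low = ai_low / gdp_growth * 100    # ~26%
share_high = ai_high / gdp_growth * 100  # ~53%
print(f"AI accounts for roughly {share_low:.0f}%-{share_high:.0f}% of growth")
# -> AI accounts for roughly 26%-53% of growth
```

Which is where the "25-50%" shorthand comes from.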

In addition to the vehicular charging, I have two AC units, a spa that runs 5-6 hours daily, and the normal array of household appliances (large refrigerator/freezer, large beverage and wine cooler, large ice-maker, two dishwashers, 4 TVs, a washer and a dryer), plus the full complement of household lights and low-voltage outdoor lights all around the property. My solar/battery setup is unable to charge the vehicles (for that we use the grid, or direct solar during the day). The batteries disgorge 80% of their load to cover the nightly need, and we probably still draw grid power to a certain degree. It’s about as efficient as it can be without impinging on our chosen lifestyle (which means lots of AC most of the year).

The water use is both a bigger deal and probably on the verge of becoming a greater monthly expense (given the cost mitigation from my solar/battery investment). I grant you that the spa and my 25-zone irrigation system (not to mention my proclivity for a verdant garden) are the biggest users. You cannot live in California and not be aware of water shortages. The good news is that among California counties, San Diego actually has a relatively better water situation than most, but it’s complicated. San Diego has aggressively diversified its water sources over the past few decades through desalination (the Carlsbad plant is the largest in the Western Hemisphere, providing about 10% of the county’s water), water recycling, local groundwater, Northern California water via the California Aqueduct, and some from the problematic Colorado River. But despite all this diversification, we still get about 50-60% of our water from imports. The answer should be more desalination, but it is expensive… and it uses a bit too much electricity (oops). So both water and electricity are short-supply resources, even for a forward-thinking and temperate place like San Diego.

Based on the latest data, the picture of US data center numbers and growth varies significantly depending on how you count. As of March 2025, the United States has approximately 5,426 data centers, the most of any country worldwide. This includes all types and sizes of facilities. There are 1,121 existing colocation data centers and 152 hyperscale self-built data centers currently operational. Virginia has the largest number of data centers in the US, with the city of Ashburn nicknamed “Data Center Alley” because of its high concentration of facilities. Major markets include Northern Virginia, Silicon Valley, Dallas, Chicago, and Phoenix, with other important hubs in Los Angeles, Seattle, and Atlanta. Northern Virginia alone has 300+ facilities with nearly 4,000 MW of power capacity, while Phoenix has 100+ centers with 1,380 MW. The US data center market was valued at $208.38 billion in 2024 and is projected to reach $308.83 billion by 2030, growing at a CAGR of 6.78%. Over 550 upcoming colocation and hyperscale self-built data center projects in the United States are expected to add over 90 GW of capacity, with several facilities still in the announced or planned stages awaiting grid connections. Goldman Sachs Research forecasts global power demand from data centers will increase 50% by 2027 and by as much as 165% by the end of the decade compared with 2023. Goldman Sachs Research further projects data center power demand will reach 84 GW by 2027, with AI growing to 27% of the overall market.

Demand for AI-ready data center capacity will rise at an average rate of 33% annually between 2023 and 2030 in a midrange scenario, meaning around 70% of total demand for data center capacity will be for data centers equipped to host advanced-AI workloads by 2030. Approximately 33% of global data center capacity will be dedicated to AI applications by 2025, a figure expected to reach 70% by 2030. McKinsey estimates that data centers equipped to handle AI processing loads will require $5.2 trillion in capital expenditures by 2030, while those powering traditional IT applications will require $1.5 trillion, for nearly $7 trillion in total capital outlays needed by 2030. Combined investments from Microsoft, Amazon, Google, Meta, and Apple alone will exceed $450 billion in 2025. This represents one of the largest infrastructure buildouts in modern US history, comparable to past expansions of the highway system or the electric grid itself.

Data centers use a massive and rapidly growing amount of electricity. Data centers consume approximately 4-5% of total US electricity – roughly 150-200 terawatt-hours (TWh) annually (that’s a lot). To put that in perspective, that’s more electricity than many entire countries use. Globally, data centers account for about 1-2% of worldwide electricity demand (several hundred TWh), though estimates vary and the US represents a disproportionately large share. It’s all about computing intensity. Projections suggest data center electricity use could double by 2030 – reaching 8-10% of US electricity demand. Some estimates go even higher – potentially 12-15% if AI deployment accelerates aggressively. The International Energy Agency projects global data center electricity consumption could more than double by 2026. In some regions of the US, data centers can represent 20-40% of local electricity demand, straining local grids. So, why are we not doing what imperialists usually do and putting data centers in less developed countries that can be bought and don’t know better? Because the US data center boom must happen primarily in the US, making the domestic grid capacity crunch unavoidable. You can’t just build data centers overseas and serve US users effectively – the laws of physics won’t allow it (mostly the round-trip latency imposed by the speed of light). This is why the collision between AI growth and US power grid capacity is so acute – there’s no escape valve of building capacity elsewhere.
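The US percentage and TWh figures hang together, which is easy to verify. A quick sketch (the ~4,100 TWh total for annual US electricity consumption is my assumed round number, not a figure from the text):

```python
# Sanity check: does 4-5% of total US electricity line up with
# the 150-200 TWh figure cited for data centers?
US_TOTAL_TWH = 4100   # approx. annual US electricity use (assumed)

for pct in (4, 5):
    twh = US_TOTAL_TWH * pct / 100
    print(f"{pct}% of {US_TOTAL_TWH} TWh = {twh:.0f} TWh")
# -> 4% = 164 TWh, 5% = 205 TWh: consistent with the 150-200 TWh range
```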

And as for water… a single large data center can use 1-5 million gallons per day (equivalent to a town of 10,000-50,000 people). Google’s data centers used about 5.6 billion gallons in 2022. Microsoft used 1.7 billion gallons in 2023. Estimates suggest US data centers collectively use hundreds of billions of gallons annually. And AI multiplies the problem. AI chips run much hotter than traditional servers, requiring more intensive cooling. Servers generate enormous heat that must be removed continuously or they’ll fail. High-performance AI chips (like Nvidia’s H100s) can consume 700+ watts each, and a data center might have tens of thousands of them packed together. Unfortunately, conventional air conditioning just can’t handle the heat density of modern AI chips, which is why operators turn to water-based evaporative cooling. To make matters worse, data centers often cluster in areas with water stress (Arizona, Northern Virginia, parts of California) where they have to compete with agriculture, residential use, and ecosystems. Naturally, climate change amplifies the problem.
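Run the heat numbers and the cooling problem becomes obvious: at 700+ watts per accelerator, the chips alone dissipate megawatts of continuous heat. A rough sketch (the 20,000-chip count and the 1.3 overhead factor, known in the industry as PUE, are illustrative assumptions, not figures from the text):

```python
# Continuous heat load of a hypothetical AI cluster.
WATTS_PER_CHIP = 700   # high-end AI accelerator draw, per the text
NUM_CHIPS = 20_000     # assumed cluster size ("tens of thousands")
PUE = 1.3              # assumed power usage effectiveness (cooling, etc.)

chip_load_mw = WATTS_PER_CHIP * NUM_CHIPS / 1e6  # heat from chips alone
facility_mw = chip_load_mw * PUE                 # total facility draw
print(f"chips: {chip_load_mw:.1f} MW, facility: {facility_mw:.1f} MW")
# -> chips: 14.0 MW, facility: 18.2 MW
```

Fourteen megawatts of heat, produced around the clock in one building, is what the cooling system must carry away.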

There’s often a water-electricity tradeoff. Water cooling is more energy-efficient (lower electricity use). Air cooling uses less water but more electricity. In a carbon-constrained world, we’ve optimized for electricity efficiency, but in water-scarce regions, this tradeoff needs rethinking. Some experts argue that water constraints could limit AI growth before electricity does, especially in the US Southwest where both tech companies and water stress are concentrated.

Man usually finds a solution to his problems. We’re good at that. Waterwise gridlock, as I call it, is our next big problem. We are also good at kicking cans down the road in our zeal to make a buck. Right now the big bucks are being made in AI investment (and, in truth, AI is generating revenue faster than most bubbles have in the past… even compared to the internet boom). But ironically, we may need to spend all our AI computational power to figure out how to avoid the collision between AI’s energy and water demands and our own human needs. It’s like when data overtook voice on phone lines… we lowly humans need more gruel in our bowls too.