In 2024, Google’s data centers consumed 6.1 billion gallons of water. Microsoft used 7.8 billion gallons. Meta’s facilities in Arizona and New Mexico drew water from aquifers that are already critically depleted.
These numbers are growing at 20-30% year-on-year, driven almost entirely by AI training and inference workloads that generate significantly more heat per rack than traditional compute.
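Compound growth at those rates adds up quickly. A minimal sketch, using the 6.1 billion gallon figure above as a starting point and a five-year horizon purely as an illustration:

```python
# Back-of-envelope projection of data center water use under 20-30% annual
# growth. The 6.1B gallon baseline is the Google figure quoted above; the
# five-year horizon is an illustrative assumption, not a forecast.

def project(base_gallons: float, annual_growth: float, years: int) -> float:
    """Compound a starting annual volume forward at a fixed growth rate."""
    return base_gallons * (1 + annual_growth) ** years

base = 6.1e9  # gallons per year
for rate in (0.20, 0.30):
    print(f"{rate:.0%} growth -> {project(base, rate, 5) / 1e9:.1f}B gallons in 5 years")
```

At 20% the baseline roughly two-and-a-half-folds in five years; at 30% it nearly quadruples.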
Why Data Centers Need So Much Water
Most large data centers use evaporative cooling — essentially, they spray water across heat exchangers to absorb waste heat through evaporation. It’s efficient at removing heat, but it’s fundamentally consumptive. The water doesn’t come back.
A typical 100MW facility consumes 3-5 million gallons of water per day. That’s equivalent to the daily water consumption of a city of 50,000 people.
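The city comparison is easy to sanity-check: divide the facility's daily draw by a typical per-capita figure (roughly 80-100 gallons per person per day for US residential use; 90 is assumed here):

```python
# Sanity check on the city-of-50,000 comparison: how many people would use
# the same amount of water per day? The 90 gal/person/day figure is an
# assumed mid-range value for US residential use.

def implied_population(facility_gal_per_day: float, per_capita_gal: float) -> float:
    """Population whose daily residential use equals the facility's draw."""
    return facility_gal_per_day / per_capita_gal

for draw in (3e6, 5e6):  # gallons/day, the range quoted above
    pop = implied_population(draw, per_capita_gal=90)
    print(f"{draw / 1e6:.0f}M gal/day ~ a city of {pop:,.0f} people")
```

The 3-5 million gallon range maps to a population of roughly 33,000-56,000, consistent with the comparison above.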
Liquid cooling — often presented as the solution to AI-density thermal challenges — can actually make the water problem worse. Direct-to-chip liquid cooling handles higher heat densities brilliantly, but when the secondary rejection loop dumps that heat through evaporative towers, it requires 3-5x more water per rack than traditional air cooling.
The Regulatory Squeeze
Regulators are waking up. The Netherlands imposed a moratorium on new data center construction in Amsterdam partly due to water and power concerns. Singapore has capped data center capacity. Ireland — Europe’s data center capital — has flagged grid and water constraints.
In the UK, the Environment Agency has warned that several regions face water stress. Thames Water supplies water to most of London’s data center cluster. Southern Water serves the growing Hampshire corridor. Both utilities are under severe financial and operational pressure.
Planning permissions for new facilities increasingly require Water Usage Effectiveness (WUE) commitments — litres of water consumed per kilowatt-hour of IT energy. But WUE, like PUE, only measures how efficiently you consume water — not whether you should be consuming it at all.
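The Green Grid defines WUE as annual site water usage in litres divided by IT equipment energy in kWh. A minimal sketch, using the facility figures quoted above (the specific load and draw are illustrative, not from any real site):

```python
# WUE (Water Usage Effectiveness), as defined by The Green Grid:
#   WUE = annual site water usage (litres) / IT equipment energy (kWh)

GALLONS_TO_LITRES = 3.78541

def wue(annual_water_gallons: float, it_energy_kwh: float) -> float:
    """Site water usage effectiveness in L/kWh."""
    return annual_water_gallons * GALLONS_TO_LITRES / it_energy_kwh

# Assumed example: a 100 MW IT load running flat out for a year,
# drawing 4M gallons/day (mid-range of the figures above).
it_kwh = 100_000 * 24 * 365
water_gal = 4e6 * 365
print(f"WUE = {wue(water_gal, it_kwh):.2f} L/kWh")
```

That works out to roughly 6.3 L/kWh — well above the 1-2 L/kWh fleet averages large operators typically report, which shows how sensitive the metric is to the consumption assumptions you feed it.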
The Atmospheric Alternative
What if a data center could produce water instead of consuming it?
The atmosphere contains approximately 12,900 cubic kilometres of water vapour at any given time. Atmospheric water harvesting — extracting moisture from air — is well-established technology in arid regions for drinking water.
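The recoverable moisture in a given airflow is straightforward to estimate from temperature and relative humidity. A rough sketch using the standard Magnus approximation for saturation vapour pressure — the airflow and capture efficiency below are illustrative assumptions, not specifications of any particular harvester:

```python
# Rough yield estimate for atmospheric water harvesting: moisture carried
# by a bulk airflow at a given temperature and relative humidity.
# Airflow and capture efficiency are illustrative assumptions.
import math

R_V = 461.5  # specific gas constant of water vapour, J/(kg*K)

def vapour_density(temp_c: float, rel_humidity: float) -> float:
    """Water vapour density in kg/m^3 (Magnus formula + ideal gas law)."""
    e_sat_hpa = 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))
    e_pa = rel_humidity * e_sat_hpa * 100.0
    return e_pa / (R_V * (temp_c + 273.15))

airflow_m3_s = 500.0  # assumed bulk airflow through the harvester
efficiency = 0.3      # assumed fraction of vapour actually captured
rho = vapour_density(25.0, 0.60)  # ~13.8 g/m^3 at 25 C, 60% RH
litres_per_day = rho * airflow_m3_s * efficiency * 86_400
print(f"{litres_per_day:,.0f} L/day")
```

Under these assumptions the yield is on the order of 180,000 litres a day — and it falls sharply in dry climates, which is why seasonal moisture variability matters so much for siting.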
Project Saguaro’s Atmospheric Density Engine (ADE) takes this principle and integrates it directly into the data center cooling loop. By using subterranean convection driven by waste heat differentials, the system targets net water production — meaning the facility would discharge more clean water than it consumes.
At TRL 2-3, this is still in the validation phase. The critical unknowns include buoyancy budget calculations, pressure-loss profiles, and seasonal variability in atmospheric moisture content. But the thermodynamic basis is sound: where there is waste heat and atmospheric moisture, there is recoverable water.
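The buoyancy budget mentioned above can be sketched to first order with the standard stack-effect formula for naturally driven airflow. All geometry and temperatures here are illustrative assumptions, not ADE parameters:

```python
# First-order stack-effect estimate: buoyancy-driven airflow from a waste
# heat differential, Q = Cd * A * sqrt(2 * g * H * dT / T).
# Discharge coefficient, opening area, stack height, and temperatures
# are all illustrative assumptions.
import math

G = 9.81  # gravitational acceleration, m/s^2

def stack_flow(cd: float, area_m2: float, height_m: float,
               delta_t_k: float, ambient_k: float) -> float:
    """Buoyancy-driven volumetric airflow in m^3/s."""
    return cd * area_m2 * math.sqrt(2 * G * height_m * delta_t_k / ambient_k)

q = stack_flow(cd=0.6, area_m2=10.0, height_m=30.0,
               delta_t_k=15.0, ambient_k=298.0)
print(f"{q:.1f} m^3/s")
```

A 15 K differential over a 30 m stack moves air on the order of tens of cubic metres per second through a 10 m² opening — enough to make the pressure-loss profile of the ductwork the binding constraint, which is presumably why it appears on the list of unknowns.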
What Happens If We Don’t Act
AI workloads are projected to withdraw 4.2-6.6 billion cubic metres of water globally by 2027 — and that's a conservative estimate based on current growth trajectories. (Cubic metres, not gallons: the global figure is orders of magnitude larger than any single operator's.) As GPT-scale models become standard enterprise infrastructure rather than research curiosities, every major cloud region will face water competition between data centers and communities.
The data center industry has two choices: fight for water allocations against hospitals, farms, and households — or develop infrastructure that doesn’t need water from the mains at all.
Project Saguaro is pursuing the second option.