Category: Industry Analysis

  • How AI Workloads Are Forcing a Complete Rethink of Data Center Design

    In 2023, a typical enterprise data center rack drew 7-10kW of power. By 2025, AI training racks routinely draw 40-70kW. NVIDIA’s GB200 NVL72 rack — the workhorse of large language model training — pulls 120kW.

    That’s not an incremental change. It’s a 10x increase in heat density per square foot. And the cooling infrastructure built for the cloud era simply cannot cope.

    The Thermal Wall

    Traditional air cooling works by pushing cold air through server racks and extracting the warmed air. At 7-10kW per rack, this is straightforward. At 40kW+, you physically cannot move enough air through the rack to prevent thermal throttling. The servers slow down to avoid damaging themselves.
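    A quick sanity check on why air runs out of headroom, using the standard sensible-heat relation Q = ṁ·cp·ΔT. The numbers below (a 12 °C allowable temperature rise across the rack, standard air properties) are illustrative assumptions, not vendor specifications:

    ```python
    # Back-of-envelope airflow check: Q = m_dot * cp * dT
    # Assumed values (illustrative): cp of air ~1005 J/(kg*K),
    # air density ~1.2 kg/m^3, a 12 K allowable rise across the rack.
    CP_AIR = 1005.0   # J/(kg*K)
    RHO_AIR = 1.2     # kg/m^3

    def airflow_m3_per_s(heat_w: float, delta_t_k: float = 12.0) -> float:
        """Volumetric airflow needed to carry away heat_w watts of heat."""
        mass_flow = heat_w / (CP_AIR * delta_t_k)   # kg/s
        return mass_flow / RHO_AIR                  # m^3/s

    for kw in (10, 40, 120):
        m3s = airflow_m3_per_s(kw * 1000)
        cfm = m3s * 2118.88  # 1 m^3/s is about 2118.88 CFM
        print(f"{kw:>3} kW rack: {m3s:5.2f} m^3/s (~{cfm:,.0f} CFM)")
    ```

    Under these assumptions a 120 kW rack needs on the order of 17,000 CFM through a single cabinet, roughly twelve times the airflow of a 10 kW rack, which is why air cooling hits a wall well before AI-era densities.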

    The industry’s response has been liquid cooling — pumping coolant directly to chips or immersing entire servers in dielectric fluid. Liquid cooling handles the density problem brilliantly. A rear-door heat exchanger or direct-to-chip cold plate can manage 100kW+ per rack without breaking a sweat.

    But liquid cooling creates two new problems:

    • Water consumption increases 3-5x per rack. The heat has to go somewhere, and the secondary rejection loop (from liquid to atmosphere) typically uses evaporative cooling towers that consume enormous volumes of water.
    • Waste heat concentration increases. Instead of diffuse warm air at 35°C, you have concentrated hot water at 45-60°C. More heat, in a more useful form — if you have a system that can capture it.

    The Opportunity in the Problem

    Here’s the counterintuitive insight: AI workloads are actually better suited to waste heat recovery than traditional compute.

    Traditional servers produce low-grade heat at 30-35°C — barely warm enough to drive any useful thermodynamic process. But AI training GPUs run at 70-80°C junction temperatures, producing waste heat at 45-60°C in the cooling loop. That’s a significantly more useful temperature differential.
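    The value of that hotter coolant can be sanity-checked against the Carnot limit, the theoretical ceiling on converting heat into work. This sketch assumes an illustrative 25 °C ambient heat sink:

    ```python
    # Carnot upper bound on heat-to-work conversion: eta = 1 - T_cold / T_hot
    # Illustrative 25 C sink; temperatures converted to kelvin.
    def carnot_limit(t_hot_c: float, t_cold_c: float = 25.0) -> float:
        t_hot, t_cold = t_hot_c + 273.15, t_cold_c + 273.15
        return 1.0 - t_cold / t_hot

    for t in (35, 45, 60):
        print(f"{t} C source: Carnot limit {carnot_limit(t):.1%}")
    ```

    The theoretical ceiling roughly triples between a 35 °C source and a 60 °C one. Real low-grade recovery systems capture only a fraction of the Carnot limit, but the scaling with source temperature is exactly why hotter coolant matters.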

    Project Saguaro’s THA system is specifically designed to operate on waste heat in the 40-60°C range. Higher heat densities from AI workloads mean more thermal energy per rack to recover — and at temperatures that improve the THA’s thermodynamic efficiency.

    In other words: the industry’s biggest thermal challenge is our system’s ideal operating condition.

    What This Means for Data Center Planning

    Operators planning new AI-capable facilities face a choice:

    1. Build conventional + liquid cooling: Handles the density, but locks in massive water consumption and grid dependency for the facility’s 15-20 year lifetime.
    2. Build for heat recovery from day one: Design the liquid cooling loop to feed waste heat into a recovery system rather than rejecting it to cooling towers.

    Option 2 doesn’t require waiting for Project Saguaro to reach TRL 9. It requires designing the plumbing and thermal architecture to be recovery-ready — hot water loops that can connect to a THA or ORC system when the technology is validated.

    The incremental cost of recovery-ready design is minimal compared to the cost of retrofitting a facility that was built to dump heat. Operators who plan for heat recovery today will have a significant competitive advantage when the technology matures.

    The Next 5 Years

    By 2030, AI workloads are projected to consume 3-4% of global electricity — up from roughly 1% today. That’s tens of gigawatts of additional heat that will be generated, concentrated, and (in almost all current plans) wasted.
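    That "tens of gigawatts" claim is easy to reproduce. Assuming global electricity generation of roughly 30,000 TWh per year (an illustrative round number), a 3-4% share works out to:

    ```python
    # Rough scale check: average continuous power implied by a share
    # of annual global generation (~30,000 TWh/yr assumed, illustrative).
    GLOBAL_TWH_PER_YEAR = 30_000
    HOURS_PER_YEAR = 8760

    def avg_power_gw(share: float) -> float:
        """Average continuous power (GW) for a given share of annual generation."""
        return share * GLOBAL_TWH_PER_YEAR * 1000 / HOURS_PER_YEAR

    print(f"3% share: ~{avg_power_gw(0.03):.0f} GW average draw")
    print(f"4% share: ~{avg_power_gw(0.04):.0f} GW average draw")
    ```

    That is on the order of 100-140 GW of average continuous draw, and since essentially all of it is ultimately dissipated as heat in or near the servers, "tens of gigawatts" is, if anything, conservative.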

    The operators who capture that heat and convert it into electricity and water will have facilities that are cheaper to run, easier to permit, and more attractive to ESG-conscious tenants.

    The operators who don’t will be running 2020-era infrastructure in a 2030 regulatory environment. That’s not a comfortable position to be in.

    Project Saguaro exists to make sure option 2 is available when the industry needs it. The validation work we’re doing now — THA component testing, ADE CFD simulation, integrated system modelling — is laying the engineering foundation for the next generation of data center infrastructure.

    Join the consortium to help shape what that generation looks like.

  • The Data Center Water Crisis Nobody Is Talking About

In 2023, Google’s data centers consumed 6.1 billion gallons of water. Microsoft reported roughly 7.8 million cubic metres (about two billion gallons). Meta’s facilities in Arizona and New Mexico drew water from aquifers that are already critically depleted.

    These numbers are growing at 20-30% year-on-year, driven almost entirely by AI training and inference workloads that generate significantly more heat per rack than traditional compute.

    Why Data Centers Need So Much Water

    Most large data centers use evaporative cooling — essentially, they spray water across heat exchangers to absorb waste heat through evaporation. It’s efficient at removing heat, but it’s fundamentally consumptive. The water doesn’t come back.

    A typical 100MW facility consumes 3-5 million gallons of water per day. That’s equivalent to the daily water consumption of a city of 50,000 people.
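    The physics puts a hard floor under numbers like these. Rejecting heat by evaporation consumes at least Q/h_fg kilograms of water per second, where h_fg ≈ 2.45 MJ/kg is the latent heat of vaporisation; real towers add blowdown and drift losses on top. A rough sketch:

    ```python
    # Minimum evaporative water use to reject Q watts of heat:
    # m_dot = Q / h_fg, with latent heat of vaporisation h_fg ~ 2.45 MJ/kg.
    # Real towers use more (drift, blowdown), so treat this as a floor.
    H_FG = 2.45e6        # J/kg
    L_PER_GALLON = 3.785

    def evaporation_gal_per_day(heat_w: float) -> float:
        kg_per_s = heat_w / H_FG            # 1 kg of water ~ 1 litre
        litres_per_day = kg_per_s * 86_400
        return litres_per_day / L_PER_GALLON

    print(f"100 MW of rejected heat: ~{evaporation_gal_per_day(100e6):,.0f} gal/day minimum")
    ```

    Just under a million gallons per day is the thermodynamic minimum for a 100MW heat load; blowdown, drift, and non-IT loads push real facilities toward the higher figures above.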

    Liquid cooling — often presented as the solution to AI-density thermal challenges — actually makes the problem worse. Direct-to-chip liquid cooling handles higher heat densities brilliantly, but it requires 3-5x more water per rack than traditional air cooling once you account for the secondary rejection loop.

    The Regulatory Squeeze

    Regulators are waking up. The Netherlands imposed a moratorium on new data center construction in Amsterdam partly due to water and power concerns. Singapore has capped data center capacity. Ireland — Europe’s data center capital — has flagged grid and water constraints.

    In the UK, the Environment Agency has warned that several regions face water stress. Thames Water supplies water to most of London’s data center cluster. Southern Water serves the growing Hampshire corridor. Both utilities are under severe financial and operational pressure.

    Planning permissions for new facilities increasingly require Water Usage Effectiveness (WUE) commitments. But WUE, like PUE, only measures how efficiently you consume water — not whether you should be consuming it at all.
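    For concreteness, WUE is defined as site water use in litres divided by IT energy in kilowatt-hours. The sketch below uses illustrative numbers (a 100MW IT load, 3.5 million litres per day) to show how a facility can post a respectable ratio while still drawing millions of litres daily:

    ```python
    # WUE = site water use (litres) / IT energy (kWh) -- an efficiency ratio only.
    def wue(litres_water: float, it_kwh: float) -> float:
        return litres_water / it_kwh

    # Illustrative day at a 100 MW IT-load facility using 3.5M litres:
    it_kwh_per_day = 100_000 * 24          # kW * hours
    print(f"WUE: {wue(3_500_000, it_kwh_per_day):.2f} L/kWh")
    ```

    The ratio looks acceptable against a commonly cited average of around 1.8 L/kWh, yet the absolute draw is still 3.5 million litres every single day. That is the gap between efficiency metrics and actual footprint.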

    The Atmospheric Alternative

    What if a data center could produce water instead of consuming it?

    The atmosphere contains approximately 12,900 cubic kilometres of water vapour at any given time. Atmospheric water harvesting — extracting moisture from air — is well-established technology in arid regions for drinking water.

    Project Saguaro’s Atmospheric Density Engine (ADE) takes this principle and integrates it directly into the data center cooling loop. By using subterranean convection driven by waste heat differentials, the system targets net water production — meaning the facility would discharge more clean water than it consumes.

    At TRL 2-3, this is still in the validation phase. The critical unknowns include buoyancy budget calculations, pressure-loss profiles, and seasonal variability in atmospheric moisture content. But the thermodynamic basis is sound: where there is waste heat and atmospheric moisture, there is recoverable water.

    What Happens If We Don’t Act

AI workloads are projected to withdraw 4.2-6.6 billion cubic metres of water globally by 2027, and that’s a conservative estimate based on current growth trajectories. As GPT-scale models become standard enterprise infrastructure rather than research curiosities, every major cloud region will face water competition between data centers and communities.

    The data center industry has two choices: fight for water allocations against hospitals, farms, and households — or develop infrastructure that doesn’t need water from the mains at all.

    Project Saguaro is pursuing the second option.

  • Analysis: Towering South Asia: India’s digital leap

While catching up on industry news, one statistic jumped out at me: **India accounts for over 40% of the world’s data center capacity growth**. What’s driving this rapid expansion? A recent article in Data Center Dynamics, “Towering South Asia: India’s Digital Leap”, sheds light on the trends shaping India’s data center market.

According to the article, India’s data center market is shifting from serving telecom giants to serving hyperscalers, prioritizing space-efficient design and green power to meet demand growing beyond the Tier 1 cities. This shift is driven by a surge in cloud adoption, e-commerce growth, and increasing reliance on digital services.

    The Real Challenge

    As data center operators, we’re faced with the daunting task of meeting this exploding demand while minimizing our environmental footprint. The article highlights the pressure to adopt sustainable practices, from reducing energy consumption to utilizing green power sources. But what does this mean for us on the ground?

Take space efficiency, for instance. As hyperscalers continue to drive growth, we need to optimize designs to fit more capacity into existing facilities, or build new ones that are inherently efficient. That means rethinking traditional approaches like evaporative cooling and exploring alternatives such as subterranean cooling.

    Our Approach

At Project Saguaro, we’re tackling this challenge head-on by developing integrated solutions that reduce carbon footprint while increasing operational efficiency. Our approach centers on two key technologies: the THA (waste heat recovery) and the ADE (atmospheric water harvesting driven by subterranean convection). By combining them, we aim to create a fully integrated, symbiotic system that sets new standards for sustainability.

We’re targeting net-positive water production (**300k L/day**) and **95% grid independence**, though both targets still have to be proven through the consortium’s testing and validation programme. What sets us apart is the holistic approach: competitors typically tackle heat-to-power or water recycling, but each in isolation.

    Join Us

    If you’re as excited about the potential for net-positive data centers as we are, join the conversation! Learn more about Project Saguaro and our consortium’s efforts to validate innovative solutions that can transform the industry. Visit our website or reach out to us at consortium@netpositivedatacenters.org. Let’s work together to create a more sustainable future for data centers.