Data Center Water Consumption: AI Uses More Water Than Entire Cities
- lindenfelder
- 6 days ago
A single Meta data center in Newton County, Georgia consumes 500,000 gallons of water per day, roughly 10% of the entire county's supply. In Iowa, one facility consumed a billion gallons in 2024, enough to cover the state's residential water needs for five days. These are not outliers. Large data centers routinely consume as much water as towns of 10,000 to 50,000 people, and the AI boom is making the problem dramatically worse.
For scale, consider London. The city's nine million residents use about 2.6 billion liters of water per day, or roughly 949 billion liters per year. The International Energy Agency estimates global data centers consumed approximately 560 billion liters in 2023, already more than half of London's total. By 2030, that figure is projected to exceed 1.2 trillion liters, surpassing London entirely. For anyone tracking corporate carbon footprints (the total greenhouse gas emissions from a company's operations and supply chain), this is a blind spot. The same goes for companies working toward net zero (balancing remaining emissions with equivalent removals).
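These comparisons are easy to verify with back-of-envelope arithmetic. The sketch below uses only the approximate figures quoted above to recompute London's annual use and the data center industry's share of it:

```python
# Back-of-envelope check of the London comparison, using the
# approximate figures quoted in the article.

LONDON_DAILY_LITERS = 2.6e9   # ~9M residents, ~2.6 billion liters/day
DC_2023_LITERS = 560e9        # IEA estimate, global data centers, 2023
DC_2030_LITERS = 1.2e12       # projected global consumption, 2030

london_annual = LONDON_DAILY_LITERS * 365   # ~949 billion liters/year

print(f"London annual use:       {london_annual / 1e9:.0f} billion liters")
print(f"Data centers 2023 share: {DC_2023_LITERS / london_annual:.0%}")
print(f"Projected 2030 share:    {DC_2030_LITERS / london_annual:.0%}")
```

Run it and the 2023 figure lands near 59% of London's annual use, with the 2030 projection at roughly 126%, consistent with the claims above.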
Why Data Centers Need So Much Water
Data center water consumption falls into two categories: direct and indirect. Direct consumption comes from cooling. Servers generate enormous heat, and the most common method for managing it, evaporative cooling, pushes warm air through water-soaked pads. Around 80% of the water drawn in for cooling evaporates and never returns to the local water system.
Indirect consumption is even larger. The IEA estimates that roughly 60% of total data center water use comes from the electricity generation that powers these facilities. A 100-megawatt data center, which draws more power than 75,000 homes, consumes about two million liters of water per day. In Arizona, where data centers already account for 7.4% of state power consumption, Meta's facility in Goodyear, outside Phoenix, uses 56 million gallons of potable water annually. Texas faces an even steeper trajectory: a study by the Houston Advanced Research Center projects the state's data centers will consume 49 billion gallons of water in 2025, potentially rising to 399 billion gallons by 2030. To put that in London terms, the 2030 Texas projection alone, roughly 1.5 trillion liters, would exceed London's entire annual supply.
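Because the Texas figures are quoted in gallons and the London figures in liters, a quick conversion (assuming US gallons at about 3.785 liters each) makes the comparison concrete:

```python
# Convert the Texas projections to liters and compare them against
# London's annual water use, using the figures quoted above.

LITERS_PER_US_GALLON = 3.785
TEXAS_2025_GALLONS = 49e9          # projected Texas data center use, 2025
TEXAS_2030_GALLONS = 399e9         # projected Texas data center use, 2030
LONDON_ANNUAL_LITERS = 2.6e9 * 365   # ~949 billion liters/year

texas_2025 = TEXAS_2025_GALLONS * LITERS_PER_US_GALLON   # ~185 billion liters
texas_2030 = TEXAS_2030_GALLONS * LITERS_PER_US_GALLON   # ~1.51 trillion liters

print(f"Texas 2025 vs London: {texas_2025 / LONDON_ANNUAL_LITERS:.0%}")
print(f"Texas 2030 vs London: {texas_2030 / LONDON_ANNUAL_LITERS:.0%}")
```

By this conversion, the 2025 projection is about a fifth of London's annual supply, and the 2030 projection overshoots it by more than half.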
The AI Factor
Before generative AI entered the mainstream, the biggest cloud computing companies pledged to be "water positive" by 2030, meaning they would replenish more water than they consume. Those commitments are now under serious strain.
AI workloads are far more resource-intensive than traditional computing. Research from the University of California, Riverside estimates that each 100-word AI prompt uses roughly half a liter of water. A December 2025 study published in the journal Patterns estimates that AI systems alone could be responsible for 312 to 765 billion liters of water consumption annually. At the upper bound, that is more water than the IEA attributed to the entire global data center industry in 2023, and that is from AI alone, not the full picture.
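One way to get a feel for the Patterns estimate is to invert it: at the Riverside figure of roughly half a liter per 100-word prompt, how many daily prompts would the study's bounds correspond to? This is purely illustrative arithmetic, since the study's estimate covers more than prompt-serving alone:

```python
# Illustrative inversion: how many half-liter prompts per day would
# account for the study's 312-765 billion liters/year range?
# (Rough arithmetic only; the study's scope is broader than prompts.)

LITERS_PER_PROMPT = 0.5                  # UC Riverside estimate, ~100-word prompt
ANNUAL_LOW, ANNUAL_HIGH = 312e9, 765e9   # liters/year, Patterns study

for label, annual in [("low", ANNUAL_LOW), ("high", ANNUAL_HIGH)]:
    prompts_per_day = annual / 365 / LITERS_PER_PROMPT
    print(f"{label}: ~{prompts_per_day / 1e9:.1f} billion prompts/day")
```

The bounds work out to roughly 1.7 to 4.2 billion prompt-equivalents per day, every day of the year.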
Where these data centers are being built makes the problem worse. Two-thirds of new data centers built since 2022 are located in regions already classified as water-stressed. In Northern Virginia, the world's densest data center cluster, more than 300 facilities operate in just a handful of counties, with dozens more in development. OpenAI's planned 1.2-gigawatt campus in Abilene, Texas sits in a region already grappling with water scarcity.
Carbon and Water: A Trade-Off Most Companies Ignore
Here is the part most climate strategies miss. Data centers that use less water typically require more energy-intensive air cooling, which increases Scope 2 emissions (indirect emissions from purchased electricity). Conversely, water-cooled facilities emit roughly 10% less carbon but place immense pressure on freshwater supplies. This creates a direct tension between decarbonization goals and water stewardship, one that carbon offsets (credits purchased to compensate for emissions elsewhere) alone cannot resolve.
The IEA estimates that data centers produced approximately 182 million tons of CO2 in 2024, representing about 1% of global energy-related emissions. A 2024 study of 2,132 U.S. data centers found their average carbon intensity was 48% higher than the national average across all economic sectors. The reason: many facilities are located in areas where the local electricity grid still runs primarily on coal and natural gas, meaning every kilowatt-hour they draw carries a heavier carbon load.
For companies accounting for their value chain emissions, the water and carbon footprint of cloud infrastructure belongs squarely within Scope 3 emissions (all indirect emissions across a company's supply chain) reporting, yet most organizations still do not track it.
Key Takeaway
Data center water consumption is climbing fast and colliding with water scarcity in the very regions where facilities are being built, from the deserts of Arizona to the aquifers of rural Georgia. When a single industry is on pace to outdrink London by the end of this decade, the environmental conversation needs to expand beyond carbon. Companies serious about climate integrity need to start accounting for water as a core part of their environmental strategy.


