There’s a seductive narrative forming around artificial intelligence. It’s the image of a limitless, clean digital world where information is generated effortlessly, in seconds. Because these tools exist "in the cloud," many assume they float above the mess of real-world resource consumption. The myth suggests that AI is sustainable, immaterial, and essentially free to use—economically, ethically, and environmentally. Beneath this harmless facade lies a cost most users never see: the consumption of water.
The issue is that the enormous computational power behind modern AI systems is grounded in massive, physical infrastructure—data centers packed with servers that hum around the clock. Shaolei Ren, a professor of computer engineering at UC Riverside, explains that the heat produced by these servers must be dissipated into the external environment to prevent the system from overheating. A massive cooling system moves heat from the servers to heat exchangers, and from there facility-level cooling transfers it from the exchangers into the outside environment. It is this facility-level cooling stage that consumes water. What rarely enters public discussion is how much water is actually required to keep AI running. Every chatbot reply and every AI-generated image comes with a hidden cost: billions of gallons of freshwater diverted to keep the machines cool. Ren estimates that answering 10 to 50 ChatGPT questions consumes around 500 milliliters of water, the size of a standard bottle. As demand for AI tools explodes, this cost is climbing fast, often in places that can least afford the strain.
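To put Ren's figure in perspective, here is a rough back-of-envelope sketch in Python. It uses only the 500 mL per 10–50 queries estimate quoted above; the daily query volume is a purely hypothetical number chosen for illustration, not a reported statistic.

```python
# Back-of-envelope water estimate based on Ren's figure:
# roughly 500 mL consumed per 10-50 ChatGPT queries.
ML_PER_BATCH = 500                 # milliliters per batch of queries
QUERIES_LOW, QUERIES_HIGH = 10, 50 # batch size range

ml_per_query_high = ML_PER_BATCH / QUERIES_LOW   # upper bound: 50 mL/query
ml_per_query_low = ML_PER_BATCH / QUERIES_HIGH   # lower bound: 10 mL/query

# Hypothetical daily load (illustrative only, not a reported number).
daily_queries = 100_000_000

liters_low = daily_queries * ml_per_query_low / 1000
liters_high = daily_queries * ml_per_query_high / 1000

print(f"Per query: {ml_per_query_low:.0f}-{ml_per_query_high:.0f} mL")
print(f"At {daily_queries:,} queries/day: "
      f"{liters_low / 1e6:.1f}-{liters_high / 1e6:.1f} million liters/day")
```

Even at the low end of the range, the water use of a single popular service quickly reaches millions of liters per day, which is why per-query figures that sound trivially small add up at scale.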
In the last few years, the water consumption of tech giants has surged—largely due to AI infrastructure expansion. In 2022, Google reported using over 5 billion gallons of water to cool its data centers—equivalent to irrigating 37 golf courses annually—a staggering 20% jump from the year before. As AI systems grow larger and more complex, they require exponentially more energy and cooling, and that means more water. At the current pace, this isn't scaling sustainably. More importantly, this is happening without public accountability or transparency. We marvel at the performance of these systems without pausing to ask what went into building them—or what’s being quietly depleted to keep them running. The irony is that AI infrastructure is directly contributing to the very climate stress that future AI models claim to help solve. We’re told AI will help fight drought, model water shortages, and support climate action. But at the moment, it’s part of the problem.
Artificial intelligence doesn’t have to be environmentally reckless. But the first step is breaking the illusion that it’s free—in any sense of the word. We need greater transparency from tech companies on water and energy usage, not just in vague corporate sustainability reports but in real-time, location-specific disclosures. Second, innovation must shift from scale-at-all-costs toward efficiency. Placing data centers in cooler climates with abundant renewable energy—like Iceland or Scandinavia—should be prioritized, not avoided for profit margins. Finally, there needs to be regulation. Just as we’ve set environmental standards for cars, appliances, and buildings, it’s time we apply the same lens to AI systems. “Smarter” should mean more than faster and larger—it should mean being conscious of its footprint.