
Recent studies reveal that artificial intelligence systems, particularly those built on OpenAI’s GPT-3, consume significant amounts of water during user interactions. A single short conversation with the AI can require up to 500 milliliters of water, roughly the volume of a standard bottle of water. That figure covers not just the water used to cool data center servers but also the water consumed at the power plants generating the electricity that runs them.
Understanding the water consumption associated with AI is crucial, as highlighted by Leo S. Lo, Dean of Libraries and Professor of Education at the University of Virginia. He emphasizes that grasping AI’s impact involves more than just knowing how to engage with it; it requires awareness of the infrastructure and the environmental trade-offs that accompany its use.
Understanding AI’s Water Footprint
The analysis of AI’s water usage reveals two primary streams of consumption. The first involves the on-site cooling of servers, which generate substantial heat. Many data centers utilize evaporative cooling systems, where water is sprayed over heated components. This method effectively cools the servers but draws water from local sources, such as rivers and aquifers.
The second stream is the water used by power plants, including coal, gas, and nuclear facilities, which require large volumes of water for cooling and steam cycles. Even renewable sources like hydropower draw on significant amounts of water, because it evaporates from their reservoirs.
Location plays a vital role in determining how much water is used. For instance, data centers situated in cool, humid areas, such as Ireland, may rely less on water-intensive cooling methods. Conversely, facilities in Arizona during peak summer months often rely heavily on evaporative cooling, resulting in greater water consumption due to the local climate.
Innovative Solutions and the Future of Cooling
Recent advancements suggest promising alternatives to traditional cooling methods. Techniques such as immersion cooling submerge servers in non-conductive fluids, minimizing evaporative water loss. Microsoft has also announced a cooling design it says uses no water at all, circulating a specialized liquid through sealed pipes that carry heat away without evaporation.
Despite these innovations, widespread adoption remains limited by cost and the complexity of retrofitting existing data centers, so many operators still rely on the familiar, widely deployed evaporative systems.
The type of AI model also impacts water usage, with some models consuming significantly more resources than others. Studies indicate that certain models may utilize over 70 times more energy and water than more efficient alternatives.
To calculate AI’s water footprint, users can follow a simple three-step method. First, find credible research estimates regarding energy consumption for different AI models. For example, a medium-length response from GPT-5 consumes approximately 19.3 watt-hours, while GPT-4o uses about 1.75 watt-hours.
Next, apply a practical estimate for water consumption per unit of electricity, which covers both on-site cooling and power generation and ranges from 1.3 to 2.0 milliliters per watt-hour. Finally, multiply the energy usage by this water factor to get the total water footprint.
For instance, using the upper end of that range (2.0 milliliters per watt-hour), a medium-length query to GPT-5 works out to roughly 39 milliliters of water, while a similar query to GPT-4o comes to about 3.5 milliliters.
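For readers who want to run the numbers themselves, the short Python sketch below expresses the three-step method. The energy figures and the 1.3 to 2.0 milliliter-per-watt-hour range are the research estimates cited above, not measurements of any specific data center, so the output should be read as a rough range rather than a precise footprint.

```python
# Rough per-query water estimate: energy per query (Wh) multiplied by a
# water-intensity factor (mL of water per Wh of electricity).
# The figures below are the research estimates cited in the article,
# not measurements of any particular deployment.

ENERGY_WH_PER_QUERY = {
    "GPT-5 (medium-length response)": 19.3,   # watt-hours, cited estimate
    "GPT-4o (medium-length response)": 1.75,  # watt-hours, cited estimate
}

WATER_ML_PER_WH_LOW = 1.3    # lower bound of the water factor
WATER_ML_PER_WH_HIGH = 2.0   # upper bound, used in the worked example above


def water_footprint_ml(energy_wh: float, ml_per_wh: float) -> float:
    """Step 3: multiply energy use by the water-intensity factor."""
    return energy_wh * ml_per_wh


for model, wh in ENERGY_WH_PER_QUERY.items():
    low = water_footprint_ml(wh, WATER_ML_PER_WH_LOW)
    high = water_footprint_ml(wh, WATER_ML_PER_WH_HIGH)
    print(f"{model}: {low:.1f}-{high:.1f} mL of water per query")
```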
Contextualizing AI’s Water Use
The cumulative effect of AI queries on water consumption is significant. OpenAI reports handling around 2.5 billion prompts daily across its various models, including GPT-4o and GPT-5. To put this into perspective, Americans use about 34 billion liters of water daily for residential purposes, such as watering lawns.
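To make that comparison concrete, the rough back-of-envelope calculation below multiplies the reported prompt volume by the per-query range estimated earlier. Treating every prompt as a 3.5 to 39 milliliter query is a deliberate simplification for illustration; the real mix of models and response lengths will differ.

```python
# Back-of-envelope aggregate: reported daily prompt volume multiplied by the
# per-query water range estimated above, compared with the cited figure for
# daily U.S. residential water use. Assuming every prompt falls in the
# 3.5-39 mL range is an illustrative simplification, not a fleet-wide average.

PROMPTS_PER_DAY = 2.5e9                  # OpenAI-reported daily prompts
US_RESIDENTIAL_LITERS_PER_DAY = 34e9     # cited U.S. residential water use

for label, ml_per_prompt in [("low end (~GPT-4o query)", 3.5),
                             ("high end (~GPT-5 query)", 39.0)]:
    liters_per_day = PROMPTS_PER_DAY * ml_per_prompt / 1000.0  # mL -> liters
    share = liters_per_day / US_RESIDENTIAL_LITERS_PER_DAY
    print(f"{label}: about {liters_per_day / 1e6:.0f} million liters/day, "
          f"or {share:.2%} of U.S. residential use")
```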
While generative AI currently consumes a relatively small amount of water compared to other common uses, its demand is not static. Transparency in reporting water usage allows for better comparisons among AI providers and aids policymakers in making informed decisions.
Efforts to reduce water consumption include optimizing AI systems through specialized chips, efficient cooling methods, and strategic data center placement in cooler, wetter regions. Such measures can mitigate the environmental impact of AI, ensuring a balance between technological innovation and sustainability.
As the conversation around AI’s ecological footprint continues to evolve, it is imperative for stakeholders across sectors to remain informed and proactive. By understanding the intricacies of AI’s water usage and engaging in transparent practices, society can better navigate the challenges posed by rapid technological advancement.