Lately, anytime ChatGPT or AI (artificial intelligence) is brought up, you hear about the environmental impact.
It’s often discussed in vague terms: “it uses so much water.” But how much water is it really using? And what is it
using it for? The answer lies in data centres.
Data centres have become the backbone of the digital age, powering everything from Netflix streams to AI
chatbots. But powering these services requires immense computing power, creating extreme amounts of heat.
If you’ve built a computer before, you know that managing temperature is vital for things to run smoothly.
Cooling data centres requires vast amounts of water and fan power, straining water resources and consuming
large amounts of energy.
ChatGPT and its rapid growth have made the problem much worse. In its series “Power Grab,” which
investigates the impact of AI infrastructure on the environment, the Washington Post found that OpenAI’s
GPT-4 model uses 519 millilitres of water to generate a 100-word email, a little more than a bottle of
water. Recent data shows that ChatGPT has 123.5 million daily users. That’s a lot of water bottles.
Data centres rely on freshwater sources for cooling and humidity control, making the water bottle an apt
symbol of the resources needed to keep these services running.
But even before users can get their hands on a chatbot, training the language model consumes water and
energy. Large language models such as the one behind ChatGPT take months to train, adding another strain.
The Washington Post reported that Microsoft used 700,000 litres of water training GPT-3.
There are methods to reduce some of the waste, such as closed-loop cooling, which is exactly what it
sounds like: water circulates through the system and is reused, while a “heat exchanger” dissipates the
heat and blows it outside the cooling system. However, much like fan-based cooling, these systems
use a lot of energy.
The tech companies that use these data centres are pledging to minimize their eco-footprint, but these pledges
are rarely upheld. Google’s carbon emissions actually rose by 48 per cent since 2019, according to a 2024
report.
Another solution is water replenishment, in which companies work with local communities to return water to
local supplies. Google pledged to replenish 120 per cent of the water it uses by 2030 but has only
replenished 18 per cent, according to the Washington Post. Altogether, companies such as Google and
Microsoft use billions of gallons of water.
As an admitted AI user, I find this information extremely upsetting, and it is worth bringing up anytime we talk about AI.
It’s important to note that while data centres use more energy and water because of AI, they also use water
and energy for other services such as server maintenance and cloud storage. The problem would not disappear
without AI, even if conversations about AI’s environmental impact shed light on it.
As with many green initiatives, reducing personal use is often emphasized as the best solution. It is, of course,
the one within your control.
I propose a balance. We can control how much or how little we use services hosted by high-consuming data
centres, and we can demand better, holding massive companies accountable to their pledges. There are
alternative solutions such as using industrial fans instead of water for cooling, building data centres in cold
climates, using different sources of water and being more active in water replenishment, to name a few. Data
centres can work to be greener.
We should collectively share the burden of how our tech impacts the environment and the communities
around us. We can’t leave corporate responsibility out of the conversation or ignore personal responsibility;
it should be both.