ChatGPT promises to revolutionize online communication, but behind its performance hides a high environmental cost. According to as-yet-unpublished research by the University of California, Riverside, and the University of Texas at Arlington, training ChatGPT required 700,000 liters of fresh water just to keep cool the data center that hosts it: a record consumption. In total, the water consumed in developing this AI reaches 3.5 million liters.
ChatGPT: the water consumption to train it is huge
The study took into account not only the water used to cool the data center, but also the water needed to generate the electricity that powers it. To measure this consumption, the authors used WUE (Water Usage Effectiveness): the liters of water used per year to humidify and cool the data center, divided by the facility's annual electricity consumption in kWh (L/kWh).
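The WUE ratio described above can be sketched in a few lines; the figures in the example below are hypothetical and are not taken from the study:

```python
def water_usage_effectiveness(cooling_water_l_per_year: float,
                              energy_kwh_per_year: float) -> float:
    """WUE: liters of water used per year to humidify and cool a
    data center, divided by its annual electricity use in kWh."""
    return cooling_water_l_per_year / energy_kwh_per_year

# Hypothetical facility: 3.8 million liters of water per year
# against 1 million kWh of electricity consumed
print(water_usage_effectiveness(3_800_000, 1_000_000))  # 3.8 L/kWh
```

A lower WUE means the facility needs less water per unit of energy, which is why the study's result depends so strongly on where the training takes place.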
The American researchers – who describe themselves as the first to analyze the water footprint of artificial intelligence in this way – estimated that if ChatGPT had been trained outside the United States, in a region with a WUE of about 3.8 L/kWh (for example, in Asia), the water consumption would have risen to 4.9 million liters. For comparison, that is equivalent to the water used to produce 2,600 BMWs and 2,200 Teslas.
So we are talking about 700,000 liters of water spent cooling the data center that trained ChatGPT, to which we must add the 2.8 million liters needed to generate the electricity the data center consumed in order to function.
The total water used to train ChatGPT can therefore be estimated at 3.5 million liters, assuming the language model was trained in the USA, and at 4.9 million liters if training had taken place in Asia.
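The US total above is simply the sum of the two components the article reports, which can be checked directly:

```python
# Figures reported in the article (liters)
cooling_water = 700_000       # cooling the data center during training in the US
electricity_water = 2_800_000 # generating the electricity the data center consumed

total_us = cooling_water + electricity_water
print(f"{total_us:,} liters")  # 3,500,000 liters
```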
The researchers also calculated that ChatGPT "drinks" 500 milliliters of water for a conversation of 20 to 50 exchanges, and that LaMDA, Google's AI, consumes on average about one million liters of fresh water. All the data can be found in the University of California, Riverside research.