
AI models like ChatGPT and Google Bard are very ‘thirsty’, study reveals

The water used by Microsoft to cool its data centres for GPT-3 training could have produced “370 BMW cars or 320 Tesla electric vehicles.”

A recent study revealed that training artificial intelligence models such as OpenAI’s ChatGPT and Google’s Bard consumes “extremely large” amounts of water.

According to researchers from the University of California, Riverside and the University of Texas at Arlington, massive volumes of water go into keeping the data centres for such AI tools cool.

Their study revealed that Microsoft, in collaboration with OpenAI, consumed an astonishing 185,000 gallons of water to train GPT-3, which is equivalent to the amount of water required for cooling a nuclear reactor. According to the paper, the water used by Microsoft in the United States to cool its data centres for GPT-3 training could have produced “370 BMW cars or 320 Tesla electric vehicles.” Furthermore, if the training had taken place in Microsoft’s more extensive data centres in Asia, the water consumption would have tripled.

The report also raises concerns about the water used during AI inference. ChatGPT, for example, consumes the equivalent of a 500 ml bottle of water for a simple conversation of 20-50 questions and answers. Given ChatGPT’s billions of users, the paper notes, the total water footprint for inference is substantial.
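To make the scale concrete, the study’s per-conversation figure can be turned into a rough back-of-the-envelope estimate. The daily conversation volume below is a purely hypothetical assumption for illustration, not a figure from the study:

```python
# Rough estimate of inference water use, based on the study's figure of
# ~500 ml of water per 20-50 question ChatGPT conversation.

ML_PER_CONVERSATION = 500            # study's estimate for one Q&A session
CONVERSATIONS_PER_DAY = 10_000_000   # hypothetical volume (assumption)

daily_litres = CONVERSATIONS_PER_DAY * ML_PER_CONVERSATION / 1000
yearly_litres = daily_litres * 365

print(f"Daily:  {daily_litres:,.0f} litres")
print(f"Yearly: {yearly_litres:,.0f} litres (~{yearly_litres / 1e9:.1f} billion)")
```

Even at this modest assumed volume, the sketch yields around 5 million litres per day, which is why the researchers argue the aggregate footprint cannot be ignored.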

The study also suggests that these numbers will grow with the recently introduced GPT-4, which has a larger model size.

The researchers stated that the water footprint of AI models can no longer be overlooked and that addressing the water footprint must be made a priority as part of collective efforts to combat global water challenges.

Taking responsibility

In light of these concerns, the report called on companies and researchers to investigate techniques for decreasing the water usage of AI training and inference. This could entail improving data centre cooling mechanisms, designing algorithms that consume fewer computational resources, and powering data centres with renewable energy. Recycling and reusing water, as well as deploying water-efficient cooling technologies, could also be explored to lessen the impact of AI on water resources.

It also highlighted how collaboration between AI developers, environmental researchers, and policymakers could open further avenues for creating sustainable AI development guidelines and standards. This may involve integrating water usage into assessments of AI technologies’ environmental impact and promoting transparency and accountability in disclosing the water consumed by AI training and inference.