AI boom raises concerns about water consumption by Big Tech

Water consumption for cooling data centres has risen sharply at the world’s largest technology companies, raising concern about the environmental impact of the artificial intelligence boom.

Microsoft, Google, and Meta have all increased their water consumption in recent years, as millions of users rely on their online services.

Researchers estimate that demand for AI will increase water withdrawal – where water is taken from surface or ground sources – to between 4.2bn and 6.6bn cubic metres by 2027. That is roughly half the UK’s annual water consumption.

In an article cited by Nature, researchers from the University of California, Riverside, wrote that “it is a critical time to uncover and tackle AI models’ hidden water footprint in the face of the increasing freshwater scarcity crises, the worsening of extended droughts, and the rapidly aging public water infrastructure”.

Concern has grown over the past year as tech companies have raced to release products powered by generative AI. These products are built on large language models that can process and generate vast amounts of text, numbers, and other data.

The models run on powerful servers, and the data centres that house them use chilled water to cool the equipment by absorbing heat from the air. Some of the water evaporates during the cooling process, while the rest can be recycled.

Water is also used in most fuel and electricity generation processes, for instance to pump oil and gas or to create steam in thermal power plants, and it evaporates from the surface of hydroelectric reservoirs.

Microsoft’s water consumption rose by 34 percent in 2022 (the latest year for which figures are available), Google’s by 22 percent, and Meta’s by 3 percent, as a result of their increasing use of data centres.

By 2030, these companies aim to replenish more water than they consume, by restoring water to systems such as aquifers or improving irrigation infrastructure.

According to a lawsuit filed by residents of West Des Moines in Iowa a month before OpenAI completed training GPT-4, a data centre cluster there consumed 6 percent of the district’s drinking water.

Shaolei Ren, an associate professor at UC Riverside, has said that requesting 10 to 50 responses from the popular ChatGPT chatbot, running on GPT-3, consumes the equivalent of a 500ml bottle of water, depending on where and when the model is deployed.

Ren said that GPT-4 is larger and more complex, and would therefore need more power, and so more water, although detailed information about the model’s energy consumption has not been made public.

Researchers have called on AI firms to be more transparent and release more data, such as a breakdown of how much computing power different services, like search engines, consume.

OpenAI responded to a question by saying: “We are aware that training large models is water-intensive, which is why we work constantly to improve efficiency. We also believe that large language models can help accelerate scientific collaboration and the discovery of climate solutions.”

Microsoft stated: “Currently, AI computing accounts for only a small fraction of the energy used by data centres, which collectively use about 1 percent of the global electricity supply.” The company added that many factors will determine how much AI grows and what impact it has on the global race to achieve net zero.

Google has declined to comment.

Kate Crawford, a research professor at USC Annenberg who specialises in the social impacts of AI, said: “Without more transparency and reporting, it is impossible to track the true environmental impacts of AI models.

“This is important at a time of deep and prolonged droughts in many parts of the planet, and when fresh drinking water has already become a scarce commodity.”

She said: “We do not want to blindly use generative AI without knowing its true impact at a moment when the world is already facing climate change.”