Amid a fierce debate over the environmental toll of artificial intelligence, Google released a new study claiming that its Gemini AI assistant uses only a minimal amount of water and energy for each text prompt. But experts say the tech giant's claims are misleading.
Google estimates that a median Gemini text prompt uses about five drops of water, or 0.26 milliliters, and about as much electricity as watching TV for less than nine seconds, roughly 0.24 watt-hours (Wh), which produces around 0.03 grams of carbon dioxide emissions.
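As a rough sanity check on those per-prompt figures, here is a back-of-envelope sketch; the TV wattage is an assumption for illustration, not a number from Google's paper:

```python
# Back-of-envelope check of Google's stated per-prompt figures.
energy_wh = 0.24   # Google's median energy per text prompt, in watt-hours
water_ml = 0.26    # Google's median water per text prompt, in milliliters
co2_g = 0.03       # Google's median emissions per text prompt, in grams of CO2

tv_watts = 100     # assumed TV power draw; not from the paper
tv_seconds = energy_wh / tv_watts * 3600
print(f"Equivalent TV time at {tv_watts} W: {tv_seconds:.1f} seconds")  # ~8.6 s

# Implied (market-based) carbon intensity of the electricity behind one prompt
intensity_g_per_kwh = co2_g / (energy_wh / 1000)
print(f"Implied carbon intensity: {intensity_g_per_kwh:.0f} gCO2/kWh")  # ~125 gCO2/kWh
```

At an assumed 100 W television, 0.24 Wh does work out to a little under nine seconds of viewing, consistent with Google's framing.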
Google’s estimates are lower than previous research on the water- and energy-intensive data centers that undergird generative AI models. That’s due in part to efficiency improvements the company has made over the past year. But Google also left out key data points in its study, leading to an incomplete understanding of Gemini’s environmental impact, experts tell The Verge.
“They’re just hiding the critical information.”
“They’re just hiding the critical information,” says Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside. “This really spreads the wrong message to the world.” Ren has studied the water consumption and pollution associated with AI, and is one of the authors of a paper Google mentions in its Gemini study.
A big issue experts flagged is that Google omits indirect water use from its estimates. Its study included the water that data centers use in cooling systems to keep servers from overheating. Those cooling systems have sparked concerns for years about how data centers might exacerbate water shortages in drought-prone areas. Now, attention is shifting to how much more electricity data centers might need to accommodate new AI models. Rising electricity demand has triggered a spate of new plans to build gas and nuclear power plants, which also consume water in their own cooling systems and to spin turbines with steam. In fact, a majority of the water a data center consumes stems from its electricity use, which Google overlooks in this study.
As a result, with Google’s water estimate, “You only see the tip of the iceberg, basically,” says Alex de Vries-Gao, founder of the website Digiconomist and a PhD candidate at the Vrije Universiteit Amsterdam Institute for Environmental Studies who has studied the energy demand of data centers used for cryptomining and AI.
Google left out another important metric when it comes to energy consumption and pollution. The paper shares only a “market-based” measure of carbon emissions, which takes into account commitments a company makes to support renewable energy growth on power grids.
A more holistic approach would also include a “location-based” measure of carbon emissions, which considers the impact a data center has wherever it operates by taking into account the current mix of clean and dirty energy on the local power grid. Location-based emissions are typically higher than market-based emissions, and offer more insight into a company’s local environmental impact. “That is the ground truth,” Ren says. Both Ren and de Vries-Gao say that Google should have included the location-based metric, following internationally recognized standards set by the Greenhouse Gas Protocol.
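The difference between the two accounting methods can be shown with a simplified sketch; the quantities below are made-up illustrative values, not figures from Google or the Greenhouse Gas Protocol, and the residual-mix treatment is simplified:

```python
# Simplified illustration of location-based vs. market-based carbon accounting.
# All numbers are invented for illustration only.

energy_mwh = 1_000                # electricity a data center drew from the grid
grid_intensity = 0.40             # local grid average, tCO2 per MWh
residual_mix_intensity = 0.55     # grid mix after renewable claims are removed, tCO2/MWh
renewable_matched_mwh = 800       # clean energy the company has contractually matched

# Location-based: what the local grid actually emitted to serve that load.
location_based = energy_mwh * grid_intensity

# Market-based: contractually matched clean energy counts as zero; the
# remainder is scored against the residual mix.
market_based = (energy_mwh - renewable_matched_mwh) * residual_mix_intensity

print(f"Location-based: {location_based:.0f} tCO2")  # 400 tCO2
print(f"Market-based:   {market_based:.0f} tCO2")    # 110 tCO2
```

Under assumptions like these, a company can report far lower market-based emissions than the local grid actually produced on its behalf, which is why the experts want both numbers disclosed.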
Google’s paper cites previous research by Ren and de Vries-Gao and argues that it can provide a more accurate representation of environmental impact than earlier studies based on modeling that lack first-party data. But Ren and de Vries-Gao say that Google is making an apples-to-oranges comparison. Previous work was based on averages rather than the median that Google uses, and Ren faults Google for not sharing the numbers (word count or tokens for text prompts) behind how it arrived at the median. The company writes that it bases its estimates on a median prompt to prevent outliers that use inordinately more energy from skewing the results.
“You only see the tip of the iceberg, basically.”
When it comes to calculating water consumption, Google says its finding of 0.26ml of water per text prompt is “orders of magnitude less than previous estimates” that reached as high as 50ml in Ren’s research. That’s a misleading comparison, Ren contends, again because the paper Ren co-authored takes into account a data center’s total direct and indirect water consumption.
Google has yet to submit its new paper for peer review, although spokesperson Mara Harris said in an email that it’s open to doing so in the future. The company declined to respond on the record to a list of other questions from The Verge. But the study and accompanying blog posts say that Google wants to be more transparent about the water consumption, energy use, and carbon emissions of its AI chatbot, and to offer more standardized parameters for how to measure environmental impact. The company claims that it goes further than previous studies by factoring in the energy used by idling machines and supporting infrastructure at a data center, like cooling systems.
“While we’re proud of the innovation behind our efficiency gains so far, we’re committed to continuing substantial improvements in the years ahead,” Amin Vahdat, VP/GM of AI & Infrastructure for Google Cloud, and Jeff Dean, chief scientist of Google DeepMind and Google Research, say in a blog post.
Google claims to have significantly improved the energy efficiency of a Gemini text prompt between May 2024 and May 2025, achieving a 33x reduction in electricity consumption per prompt. The company says that the carbon footprint of a median prompt fell by 44x over the same time period. Those gains also explain why Google’s estimates are far lower now than studies from previous years.
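Taken at face value, the gap between those two multipliers also implies how much cleaner the underlying electricity became on a market basis; this is a simple inference from the stated figures, not a number Google reports:

```python
# If energy per prompt fell 33x while carbon per prompt fell 44x over the same
# period, the implied carbon intensity of that energy fell by their ratio.
energy_reduction = 33
carbon_reduction = 44
intensity_improvement = carbon_reduction / energy_reduction
print(f"Implied drop in carbon per unit of energy: {intensity_improvement:.2f}x")  # ~1.33x
```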
Zoom out, however, and the picture is grimmer. Efficiency gains can still lead to more pollution and more resources being used overall, an unfortunate phenomenon known as Jevons paradox. Google’s so-called “ambitions-based carbon emissions” grew 11 percent last year and 51 percent since 2019 as the company continues to aggressively pursue AI, according to its latest sustainability report. (The report also notes that Google started excluding certain categories of greenhouse gas emissions from its climate goals this year, which it says are “peripheral” or outside of the company’s direct control.)
“If you look at the total numbers that Google is posting, it’s actually really bad,” de Vries-Gao says. When it comes to the estimates it released today on Gemini, “this is not telling the whole story.”