Google shares how much energy is used for new Gemini AI prompts

Illustration: Lindsey Bailey/Axios
Google on Thursday unveiled measurements of energy, water use and emissions from text prompts using its Gemini Apps AI assistant — and it's calling for greater industry consistency in tallying AI's environmental effects.
Why it matters: The artificial intelligence boom is bringing a surge in power-thirsty data centers, but the energy needs and climate footprint remain a moving and often hazy target.
- Google's overall findings are "substantially lower than many public estimates," it said.
Driving the news: The tech giant released a detailed methodology that encompasses real-world electricity and water use from deploying AI at scale.
- That includes, for instance, energy used by idle chips and data center "overhead" — that is, equipment such as cooling that's not directly running AI workloads.
- Those are two of many factors covered in assessing the "full stack" of AI infrastructure, Google's new paper and blog posts explain.
What they found: A median text prompt on Gemini uses energy equivalent to watching TV for less than nine seconds and consumes about five drops of water, the paper finds.
- It emits 0.03 grams of carbon dioxide equivalent, Google said.
- Better software efficiency and clean energy have cut the median energy consumption of a Gemini Apps text prompt 33-fold over the past year, and its carbon emissions 44-fold, the company said.
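The everyday comparisons above can be sanity-checked with a bit of arithmetic. This sketch assumes the per-prompt figures from Google's paper (roughly 0.24 watt-hours of energy and 0.26 milliliters of water per median text prompt) and a typical ~100-watt television; none of those numbers appear in this article, so treat them as assumptions.

```python
# Back-of-the-envelope check of the "nine seconds of TV" and
# "five drops of water" comparisons. All constants are assumptions
# drawn from outside this article, not figures it states.
PROMPT_WH = 0.24        # assumed median energy per text prompt, watt-hours
TV_WATTS = 100          # assumed typical TV power draw, watts
PROMPT_WATER_ML = 0.26  # assumed median water per prompt, milliliters
DROP_ML = 0.05          # rough volume of one drop of water, milliliters

# Convert watt-hours to seconds of TV viewing: Wh -> watt-seconds, then
# divide by the TV's power draw.
tv_seconds = PROMPT_WH / TV_WATTS * 3600

# Convert milliliters of water to drops.
drops = PROMPT_WATER_ML / DROP_ML

print(f"~{tv_seconds:.1f} seconds of TV, ~{drops:.0f} drops of water")
```

Under these assumptions the result is about 8.6 seconds of TV and about five drops of water, consistent with the article's "less than nine seconds" and "about five drops."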
"As the adage goes, with great power comes great responsibility," Partha Ranganathan, a Google VP and engineering fellow, told reporters this week.
- "With great computing power comes great environmental responsibility as well, and so we've been very thoughtful about the environmental impact of this increasing computing demand caused by AI," he said.
Yes, but: The new analysis of text prompts doesn't cover video or image generation queries.
- Savannah Goodman, Google's head of advanced energy labs, said the company is continuously looking to improve transparency. But there's been little consensus on how to measure the impact of even text generation, she said.
- "That's really the most consistent request we've gotten. And so we're really starting there with this paper," she told reporters.
- The paper also doesn't apply the new methodology to the training of AI models — a big part of the energy puzzle, though Google has done other research on this.
The big picture: Gains in per-query efficiency come as overall AI use is rapidly expanding, and data center energy demand along with it.
- Estimates vary. For instance, a late 2024 DOE report projects that data centers could account for 6.7% to 12% of U.S. electricity use by 2028.
- Google, in a recent report, said its data center energy emissions fell by 12% in 2024.
- But the company's overall emissions were up 11% amid increases in greenhouse gases from its supply chain, including manufacturing and assembling AI computing hardware, and building data centers.
What we're watching: Goodman hopes the analysis of "all of the critical factors" will help the industry overall.
- "We think that this full picture provides the most accurate view of AI's overall footprint, and by openly sharing this methodology and results, we're hoping to foster greater industry-wide consistency in calculating the impact of AI."
