
For Google's Gemini LLM, the energy impact is negligible, with the average prompt consuming the equivalent energy of just three seconds of a microwave's operation.


All those data centers full of GPUs aren’t running on solar or wind power.


I did a bit of research on the environmental impact with regard to the United States. Recent numbers suggest that ChatGPT handles about 2.5 billion prompts per day worldwide, with roughly 330 million of those coming from the United States. Since the U.S. population is about 335 million, that works out to about one prompt per person per day on average, though actual users issue several times that.

On the energy side, Google recently estimated that an average Gemini inference consumes around 0.24 Wh, roughly the same as running a microwave for a single second. Older rule-of-thumb comparisons put the figure closer to 3–6 seconds of microwave use, or about 0.8–1.7 Wh per prompt. Applying those numbers to U.S. usage gives somewhere between 79 MWh and 550 MWh per day nationally, which translates to only a few to a few dozen megawatts of continuous load.

Spread across the population, that works out to between 0.09 and 0.6 kWh per person per year, just pennies' worth of electricity, comparable to a few minutes of running a clothes dryer. The bigger concern for the grid is not individual prompts but the growth of AI data centers and the energy cost of training ever-larger models.
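
If it helps, here is a quick back-of-the-envelope script reproducing that arithmetic; the constants are just the estimates quoted above, not measured data:

    # Rough check of the figures above (inputs are the thread's estimates).
    US_PROMPTS_PER_DAY = 330e6   # ~330 million US prompts/day
    US_POPULATION = 335e6
    for wh_per_prompt in (0.24, 1.67):   # Google's figure vs. older rule of thumb
        mwh_per_day = US_PROMPTS_PER_DAY * wh_per_prompt / 1e6   # Wh -> MWh
        mw_continuous = mwh_per_day / 24                         # average grid load
        kwh_per_person_year = (wh_per_prompt / 1000) * 365 * (US_PROMPTS_PER_DAY / US_POPULATION)
        print(f"{wh_per_prompt} Wh/prompt: {mwh_per_day:.0f} MWh/day, "
              f"{mw_continuous:.1f} MW continuous, {kwh_per_person_year:.2f} kWh/person/yr")

That prints roughly 79 MWh/day (3.3 MW, 0.09 kWh/person/yr) for Google's figure and about 551 MWh/day (23 MW, 0.60 kWh/person/yr) for the pessimistic one.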


How much energy does an ordinary Google search consume, by comparison?


I'm not entirely sure, since it seems that a very slimmed-down version of Gemini has been attached to search. It's definitely not the full Gemini 2.5 Pro that engineers use to carefully reason through answers; instead, it relies mostly on tool calling to stitch together a response, something like the sketch below.
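
To make "tool calling" concrete, here is a toy sketch of the general pattern. Everything in it (the function names, the fake model) is made up for illustration; it is not how Google's search integration actually works:

    # Hypothetical tool-calling loop: the model emits structured tool
    # requests, the runtime executes them, and the results are stitched
    # back into the context until the model produces a final answer.
    def web_search(query):
        return f"[top snippets for {query!r}]"   # stand-in for a real backend

    TOOLS = {"web_search": web_search}

    def fake_model(context):
        # Toy model: request one search, then answer from its result.
        if len(context) == 1:
            return {"tool": "web_search", "args": {"query": context[0]}}
        return {"text": f"Answer assembled from {context[-1]}"}

    def answer(prompt, model=fake_model):
        context = [prompt]
        while True:
            step = model(context)
            if "tool" in step:                    # model requested a tool
                context.append(TOOLS[step["tool"]](**step["args"]))
            else:                                 # model returned final text
                return step["text"]

    print(answer("energy cost of a web search?"))

The appeal of this pattern is that much of the work becomes conventional retrieval rather than heavyweight model reasoning, which is presumably part of why a slimmed-down model suffices.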



