A study by Epoch AI estimates that a typical ChatGPT (GPT-4o) query consumes about 0.3Wh, roughly ten times less than the previously cited estimate of 3Wh. The reduction reflects more efficient models and GPUs than in 2023, as well as overestimated token counts in earlier calculations.
Comparisons to Everyday Energy Usage
At 0.3Wh, a query uses less energy than an LED bulb or a laptop consumes in a few minutes. Even heavy ChatGPT users draw only a tiny fraction of the roughly 28,000Wh an average US household uses per day. Energy use varies: longer inputs and more complex queries consume more.
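To put the household comparison in perspective, here is a quick back-of-the-envelope calculation. The 0.3Wh and 28,000Wh figures come from the text; the query count for a "heavy user" is a hypothetical assumption.

```python
# Share of daily household electricity consumed by heavy ChatGPT use.
WH_PER_QUERY = 0.3            # per-query estimate from the Epoch AI study
HOUSEHOLD_WH_PER_DAY = 28_000 # average US household daily consumption
QUERIES_PER_DAY = 100         # assumed "heavy user" (hypothetical)

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY   # 30 Wh per day
share = daily_wh / HOUSEHOLD_WH_PER_DAY
print(f"{share:.2%} of daily household use")  # ~0.11%
```

Even at 100 queries a day, ChatGPT use amounts to about a tenth of a percent of a household's daily electricity.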
How ChatGPT’s Energy Use is Estimated
Chatbots run on large language models (LLMs) that require compute, which translates into energy use. The study uses GPT-4o as its reference model, with only a portion of its 200B parameters activated per query. Inference is assumed to run on Nvidia H100 GPUs, at roughly 1,050 watt-seconds (~0.3 watt-hours) per query. A conservative 10% utilization rate is also assumed, which inflates rather than understates the per-query estimate.
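The per-query arithmetic can be sketched as follows. This is an illustrative calculation, not Epoch AI's exact methodology: the 700W figure is the H100's published peak power draw, and the GPU-seconds value is an assumption chosen to match the study's ~1,050 watt-second figure.

```python
# Sketch of the per-query energy arithmetic (illustrative assumptions,
# not Epoch AI's exact methodology).

H100_TDP_W = 700.0  # Nvidia H100 (SXM) peak power draw in watts

def query_energy_wh(gpu_seconds: float, power_w: float = H100_TDP_W) -> float:
    """Energy in watt-hours for a query using `gpu_seconds` of GPU time."""
    return power_w * gpu_seconds / 3600.0  # watt-seconds -> watt-hours

# ~1.5 GPU-seconds at full H100 power reproduces the ~1,050 Ws figure:
print(round(query_energy_wh(1.5), 2))  # 0.29 Wh, i.e. about 0.3 Wh
```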
Energy Costs for Longer Inputs
Simple queries consume about 0.3Wh. Longer inputs cost more: a 10,000-token input (a short paper) consumes roughly 2.5Wh, while a 100,000-token input (about 200 pages of text) consumes about 40Wh. Follow-up queries do not demand the same energy as the original query.
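The two long-input data points above suggest superlinear growth in input length (10x the tokens costs about 16x the energy). A rough power-law interpolation between them can estimate costs for intermediate lengths; the exponent and the scaling form are assumptions fitted to those two points, not figures from the study.

```python
import math

# Power-law fit through the two data points reported above:
# 10,000 tokens -> ~2.5 Wh; 100,000 tokens -> ~40 Wh.
T1, E1 = 10_000, 2.5
T2, E2 = 100_000, 40.0
EXP = math.log(E2 / E1) / math.log(T2 / T1)  # ~1.2: superlinear in tokens

def long_input_energy_wh(tokens: float) -> float:
    """Assumed (interpolated) energy cost for a long input of `tokens`."""
    return E1 * (tokens / T1) ** EXP

print(round(long_input_energy_wh(10_000), 1))   # 2.5
print(round(long_input_energy_wh(100_000), 1))  # 40.0
print(round(long_input_energy_wh(50_000), 1))   # between 2.5 and 40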
Energy Use of Other AI Models
GPT-4o mini likely uses less energy. The reasoning-focused o1 and o3 models consume more energy because they generate longer responses. o3-mini is expected to replace GPT-4o in some cases, but its energy use remains unclear. Other AI models (Claude 3.5, Gemini, Meta AI) likely have energy costs similar to GPT-4o's.
Future Energy Trends
More efficient hardware and models will likely bring energy use per query down further. At the same time, the shift toward more complex models may lead to an overall increase in energy consumption. Some estimates suggest AI could consume up to 10% of US electricity by 2030, sparking concerns about its impact on the climate.
Collectively, the carbon footprint of AI is significant, and we should be wary of the environmental impact of its use. The solution would seem to be powering data centers with clean energy such as solar, wind, or even nuclear fusion.
Clamping down on AI use is not a feasible option, as it would limit humanity's creative potential. The revolutionary economic benefits of AI, to a certain extent, justify its carbon footprint. Nonetheless, the way forward is clean energy.
Conclusion
GPT-4o queries are much less energy intensive than previously thought, at just 0.3Wh per query. Longer inputs increase energy consumption, but most users will not encounter this. AI energy use is currently modest but could rise significantly with more complex models and higher uptake. More transparency from AI companies would help produce more accurate estimates.