Some AI Prompts Could Cause 50 Times More CO2 Emissions than Others, Researchers Find
GERMANY, JUN 19 – A study by German researchers shows reasoning AI models emit up to 50 times more CO2 than concise models, highlighting a trade-off between accuracy and environmental impact.
- Researchers in Germany published a study on June 19 measuring the CO2 emissions of 14 large language models on 1,000 standardized questions spanning diverse subjects; a minimal sketch of how per-query emissions can be tracked appears after this list.
- The study revealed that large language models equipped with reasoning functions, such as the 70-billion-parameter Cogito model, generate up to 50 times more carbon emissions than models designed to produce brief answers, highlighting a trade-off between model accuracy and environmental impact.
- The Cogito model achieved 84.9% accuracy but generated significantly more tokens and required longer computation, which caused higher carbon emissions, especially on complex subjects such as abstract algebra and philosophy.
- Maximilian Dauner emphasized that users can lower carbon emissions by requesting brief responses from AI or reserving powerful models for situations that truly need their capabilities, while also noting that emissions depend on the specific hardware and regional energy sources involved.
- The findings suggest a need for mandatory environmental reporting and regulation of AI energy use, while quantum AI is cited as a possible way to reduce emissions as data centers continue to consume large amounts of electricity globally.
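
As a rough illustration of the kind of per-query measurement the study describes, the sketch below uses the open-source codecarbon library; this is an assumption, not the researchers' actual tooling, and the model call is a placeholder.

```python
# Minimal sketch of per-query emissions tracking, assuming the open-source
# codecarbon library; this is not the study's actual tooling, only an
# illustration of how CO2 per prompt could be estimated for a given grid mix.
from codecarbon import OfflineEmissionsTracker


def answer(prompt: str) -> str:
    """Placeholder for a real LLM call (e.g. a local text-generation pipeline)."""
    return "(model output would go here)"


# Offline mode needs a country code so estimates reflect a regional energy mix;
# "DEU" (Germany) is an assumption matching the study's location.
tracker = OfflineEmissionsTracker(country_iso_code="DEU")
tracker.start()
reply = answer("Explain a result from abstract algebra step by step.")  # reasoning-style prompt
kg_co2eq = tracker.stop()  # estimated kilograms of CO2-equivalent for this run
print(f"Estimated emissions: {kg_co2eq:.6f} kg CO2eq")
```

Longer, reasoning-heavy outputs keep the hardware busy for more time, which is why the measured figure grows with the number of generated tokens.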
32 Articles
The Environmental Cost of AI's Reasoning Power | Science-Environment
A study reveals that chat-based AI generates up to six times more carbon emissions for complex prompts than for simpler ones. Large language models such as DeepSeek and Cogito produce 50 times more emissions when reasoning processes are enabled. This highlights a trade-off between AI accuracy and sustainability.
Balancing accuracy and emissions: the climate cost of your AI
Every time you ask an artificial intelligence a question, there’s a surprising cost: carbon emissions. Before an AI like ChatGPT can respond, it first breaks down your input into “tokens” — small chunks of text such as words, parts of words, or punctuation. These tokens are turned into numbers the model can process using billions of internal settings called parameters, which help it recognise patterns, make connections and predict what comes next…
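
The tokenization step described above can be seen directly with an off-the-shelf tokenizer. Below is a minimal sketch using the tiktoken library; this is an assumption for illustration, since ChatGPT-class models ship their own tokenizers and the exact splits vary by model.

```python
# A minimal sketch of the tokenization step, assuming the tiktoken library;
# each model uses its own tokenizer, so the splits and counts here are
# illustrative only.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # an assumed general-purpose encoding
prompt = "Every prompt is split into tokens before the model sees it."
token_ids = enc.encode(prompt)

print(token_ids)                              # the integer IDs the model processes
print([enc.decode([t]) for t in token_ids])   # the text chunk behind each ID
print(len(token_ids), "tokens")               # more tokens mean more compute, hence more CO2
```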


Thinking AI models emit 50x more CO2—and often for nothing
Every query typed into a large language model (LLM), such as ChatGPT, requires energy and produces CO2 emissions. Emissions, however, depend on the model, the subject matter, and the user. Researchers have now compared 14 models and found that complex answers cause more emissions than simple answers, and that models that provide more accurate answers produce more emissions. Users can, however, to an extent, control the amount of CO2 emissions ca…
Coverage Details
Bias Distribution
- 53% of the sources are Center