It is no secret that ChatGPT is taking over the life of the layman.

Whether it’s used for writing essays, job applications or emails, or, more concerningly, for providing reassurance to an anxiety-riddled brain, its ubiquity is undeniable.

Yet just as pervasive is the fact that every query comes at an environmental cost. ChatGPT, which is based on the GPT-3 model, consumes a significant amount of energy during both its training and operational phases. Training GPT-3 consumed 1,287 megawatt-hours (MWh) of electricity, resulting in more than 550 tonnes (t) of associated CO₂ emissions.
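
As a rough illustration of how an energy figure translates into emissions, the back-of-the-envelope calculation below multiplies that training energy by an assumed grid carbon intensity. The 0.43kg of CO₂ per kWh used here is an illustrative assumption, roughly a global-average grid mix, not a number taken from the reported figures.

```python
# Back-of-the-envelope check: converting training energy into CO2.
# The grid carbon intensity below is an illustrative assumption,
# not a figure from the reported data.
training_energy_mwh = 1_287
grid_intensity_kg_per_kwh = 0.43     # assumed, ~global-average grid mix

energy_kwh = training_energy_mwh * 1_000
co2_tonnes = energy_kwh * grid_intensity_kg_per_kwh / 1_000

print(f"Estimated training emissions: {co2_tonnes:.0f} t CO2")  # ~553 t, consistent with the >550 t figure
```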

ChatGPT contributes to energy consumption

Beyond the training phase, the operational use of ChatGPT also consumes energy. A ChatGPT-like application, estimated to handle around 11 million requests per hour, produces approximately 12.8t of CO₂ per year.

Furthermore, the energy requirements for running AI applications like ChatGPT are expected to increase significantly. Current rack power levels sit at around 15kW or less, but as AI demand grows, this could rise to as much as 100kW in the near future. In short, the use of ChatGPT carries substantial implications for CO₂ emissions.

The danger of politeness

Research has begun into exactly which kinds of prompts consume the most energy with ChatGPT, and what the most environmentally harmful thing you can ask is. In Germany, researchers ran 14 open-source large language models (LLMs) through 1,000 benchmark questions, measuring the CO₂ emissions of each response.

Models which used internal reasoning to ‘think’ through answers could produce up to 50 times more emissions than those which responded concisely.

Questions related to philosophy or abstract algebra, which required deeper reasoning, produced far more emissions than more factual topics such as history. Models with more parameters also produced more emissions.

However, perhaps most intriguingly, politeness itself increases emissions. When users are friendly and say ‘please’ and ‘thank you’ to ChatGPT, more words are generated. The model therefore runs for longer, requiring more power and producing more emissions. Notably, the additional words have little impact on how useful or accurate an answer is, yet their impact on the environment is considerable.
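
A minimal sketch of why longer, more polite exchanges cost more: if emissions scale roughly with the number of tokens a model generates, a few extra pleasantries per message add up. The per-token energy and grid-intensity constants below are illustrative assumptions, not measurements from the German study.

```python
# Toy model of the politeness effect: emissions assumed to scale with the
# number of tokens generated. Both constants are illustrative assumptions,
# not measurements from the study.
ENERGY_PER_TOKEN_KWH = 3e-4          # assumed energy per generated token
GRID_INTENSITY_KG_PER_KWH = 0.43     # assumed grid carbon intensity

def response_co2_grams(tokens_generated: int) -> float:
    """Rough CO2 in grams for a response of the given length."""
    energy_kwh = tokens_generated * ENERGY_PER_TOKEN_KWH
    return energy_kwh * GRID_INTENSITY_KG_PER_KWH * 1_000

concise = response_co2_grams(80)     # a to-the-point answer
polite = response_co2_grams(120)     # the same answer padded with pleasantries

print(f"concise: {concise:.1f} g CO2, polite: {polite:.1f} g CO2")
```

Under these assumptions the padded reply costs roughly 50% more CO₂ than the concise one, without being any more useful.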

The ChatGPT bestie

These facts become concerning when we assess how younger generations are increasingly using ChatGPT as a kind of online confidante. Whether it’s late-night overthinking, navigating the stresses of school, university and work, or even how to get over a situationship, Gen Z are turning to ChatGPT for reassurance. Yet as their relationship with the chatbot becomes more intimate and colloquial, so too does their language.

Ultimately, the concern arises when they thank ChatGPT for its service, or ask it to ‘please’ help them, given that research demonstrates this uses more energy and emits more CO₂.

Gen Z may be environmentally conscious, yet their intimacy with the digital world is now proving to be harmful to the planet. As Gen Z increasingly lean on ChatGPT for emotional support and companionship, the environmental cost of constant AI usage often goes unnoticed. While digital conversations may feel weightless, every chat leaves a carbon footprint. Even virtual friendships have real-world consequences.