AI chatbots: Understanding their high energy consumption

Published on 14 September 2025 at 16:02
Modified on 14 September 2025 at 16:03

AI chatbots raise deep questions about their energy consumption. Running these technologies demands enormous amounts of electricity: data centers, the backbone of AI processing, now account for about 1.5% of global electricity consumption. The trend is likely to worsen, with AI projected to become a major driver of overall electricity demand.

Language models require immense computing power, and training ever more complex systems sends energy consumption soaring. *Environmental issues will be critical* in the face of this reality. Users of these technologies must understand the implications of their use and push for the *energy transparency* that is essential for the future.

Energy consumption of AI chatbots

The growing demand for artificial intelligence chatbots is driving unprecedented energy consumption. In 2023, the data centers responsible for AI processing accounted for 4.4% of electricity use in the United States. Globally, these infrastructures consume about 1.5% of total electricity. The trend is expected to intensify, with forecasts indicating that this consumption could double by 2030.

Causes of energy intensity

The energy intensity of chatbots stems mainly from two processes: training and inference. During training, language models, such as those developed by OpenAI, are fed vast datasets. This "bigger is better" approach drives ever-larger models and, with them, sharply rising energy consumption.

The training process

Training a language model requires powerful servers. A single Nvidia DGX A100 server, for example, can draw up to 6.5 kilowatts on its own. Training runs typically span many such servers, each housing eight GPUs, and a single model can take weeks or even months to train, consuming substantial amounts of energy.
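The figures above lend themselves to a back-of-envelope calculation. The 6.5 kW per DGX A100 server comes from the text; the server count and run length below are illustrative assumptions, not reported values.

```python
# Rough training-energy estimate: power draw x server count x run time.
# Only the 6.5 kW per-server figure comes from the article; the 100-server
# fleet and 90-day run are hypothetical, chosen for illustration.

def training_energy_mwh(server_power_kw: float, num_servers: int, days: float) -> float:
    """Total energy in megawatt-hours for a sustained training run at full draw."""
    hours = days * 24
    return server_power_kw * num_servers * hours / 1000  # kWh -> MWh

# 100 DGX A100 servers at 6.5 kW each, running for ~90 days:
print(f"{training_energy_mwh(6.5, 100, 90):.0f} MWh")  # 1404 MWh
```

Even under these modest assumptions, a single run lands in the gigawatt-hour range, which is why training dominates the headline figures.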

Inference: an energy challenge

After training comes inference, the stage where the model generates responses to user queries. Although each query requires far less computing power than training, inference remains energy-intensive in aggregate: queries from millions of users multiply the demand. OpenAI currently reports processing more than 2.5 billion prompts every day.
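The aggregate effect of that query volume can be sketched in the same way. The 2.5 billion prompts per day comes from the text; the per-prompt energy figure below is purely an illustrative assumption, as real per-query costs are not publicly disclosed.

```python
# Aggregate daily inference demand: query volume x energy per query.
# The 2.5e9 prompts/day figure is from the article; 0.3 Wh per prompt
# is a hypothetical placeholder, not a measured or reported value.

def daily_inference_mwh(prompts_per_day: float, wh_per_prompt: float) -> float:
    """Total daily inference energy in megawatt-hours."""
    return prompts_per_day * wh_per_prompt / 1_000_000  # Wh -> MWh

print(f"{daily_inference_mwh(2.5e9, 0.3):,.0f} MWh/day")  # 750 MWh/day
```

Per query the cost is tiny, but at billions of queries a day it adds up to hundreds of megawatt-hours, which is the "energy challenge" this section describes.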

Energy assessment and transparency

Researchers such as Mosharaf Chowdhury and Alex de Vries-Gao are working to quantify these energy needs, aiming to provide a clear view of the energy consumption associated with AI. To date, platforms like the ML Energy Leaderboard track the energy consumption of open-source models, but transparency remains limited for large companies.

The stakes of greater transparency

Major players in the tech sector, such as Google and Microsoft, often keep this data under wraps. This opacity complicates the assessment of the environmental impact of these systems. The lack of transparency also diminishes users’ ability to make energy-responsible decisions.

Toward sustainable evolution

Users of chatbots have a role to play. An increasing demand for better transparency could prompt companies to make this crucial information public. This evolution would support more informed and responsible energy consumption choices while encouraging decision-makers to implement sustainable policies.

For an in-depth analysis of the energy challenges related to AI and their future impact, consult the article on the interdependence between artificial intelligence and our energy needs.

Frequently asked questions about AI chatbots: Understanding their high energy consumption

Why do AI chatbots consume so much energy?
AI chatbots require a significant amount of energy mainly due to the power needed for the training and inference phases. Training uses huge datasets and multiple servers, while inference demands substantial resources because of the high number of queries processed.

What is the impact of AI chatbots’ energy consumption on the environment?
The energy consumption of AI chatbots significantly contributes to carbon emissions and electricity consumption. In 2023, data centers hosting these technologies accounted for about 4.4% of electricity consumption in the United States, and this figure is on the rise.

How much energy does an AI model like ChatGPT consume during training?
It has been estimated that training the GPT-4 model used approximately 50 gigawatt-hours of energy, which is equivalent to powering a large city like San Francisco for three days.

How can users of AI chatbots reduce their energy footprint?
Users can reduce their energy footprint by avoiding excessive queries and choosing times of lower usage to prevent overloading servers. Promoting greater transparency in the energy consumption of AI services can also help make more informed choices.

What are the differences in energy consumption between training and inference of AI chatbots?
Training requires massive energy investment due to the volume of data and computational power needed, while inference consumes less energy per query but can accumulate quickly due to the high number of users.

Why is it difficult to obtain accurate figures on AI chatbots’ energy consumption?
Large companies like Google and Microsoft often keep this data private or publish unclear statistics, making it difficult to accurately assess the environmental impact of AI chatbots.

What role can public policies play in the energy consumption of AI chatbots?
Public policies can encourage transparency and require reporting on the energy used by companies, thus allowing consumers to make more responsible choices and optimize the use of AI services.

How has the training of AI models changed over the years in terms of energy consumption?
In recent years, the size and complexity of AI models have significantly increased, leading to an exponential rise in the energy consumption required for their training.
