Artificial intelligence systems could account for almost half of data center power consumption by the end of this year, an analysis has revealed.
The estimates by Alex de Vries-Gao, founder of the Digiconomist tech sustainability website, come as the International Energy Agency forecasts that, by the end of the decade, AI will require almost as much energy as Japan uses today.
De Vries-Gao's calculations, published in the sustainable energy journal Joule, are based on the power consumed by chips made by Nvidia and Advanced Micro Devices that are used to train and operate AI models. The paper also takes into account the energy consumption of chips used by other companies, such as Broadcom.
The IEA estimates that all data centers, excluding cryptocurrency mining, consumed 415 terawatt hours (TWh) of electricity last year. De Vries-Gao argues in his research that AI could already account for 20% of that total.
He said his calculations involve a number of variables, including data center energy efficiency and the electricity consumed by the cooling systems for servers handling AI systems' busy workloads. Data centers are the central nervous system of AI technology, and their high energy needs make sustainability a key concern in the development and use of artificial intelligence systems.
By the end of 2025, de Vries-Gao estimates, energy consumption by AI systems could account for up to 49% of total data center power consumption. AI consumption could reach 23 gigawatts (GW), twice the total energy consumption of the Netherlands.
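The quoted figures can be cross-checked with some back-of-envelope arithmetic. The sketch below is a reader's sanity check, not part of de Vries-Gao's methodology; it assumes the 23 GW figure represents a continuous average draw over a full year:

```python
# Back-of-envelope check of the article's figures (assumptions, not the
# paper's own method): treat 23 GW as a continuous average power draw
# and compare the implied annual energy with the IEA's 415 TWh estimate
# for all data centers (excluding crypto mining) last year.

HOURS_PER_YEAR = 365 * 24  # 8,760 hours

ai_power_gw = 23.0                                    # end-2025 estimate
ai_energy_twh = ai_power_gw * HOURS_PER_YEAR / 1000   # GWh -> TWh

dc_total_twh = 415.0                  # IEA, all data centers last year
ai_share_now_twh = 0.20 * dc_total_twh  # ~20% already attributed to AI

print(f"23 GW sustained for a year ≈ {ai_energy_twh:.0f} TWh")
print(f"20% of 415 TWh ≈ {ai_share_now_twh:.0f} TWh")
```

Under that assumption, 23 GW corresponds to roughly 201 TWh per year, close to half of last year's 415 TWh total, which is broadly consistent with the "up to 49%" share quoted above.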
However, de Vries-Gao said several factors could slow demand for hardware, such as waning demand for applications like ChatGPT. Another is geopolitical tension constraining the production of AI hardware, such as export controls. De Vries-Gao cites as an example the barriers to Chinese access to chips, which contributed to the release of the DeepSeek R1 AI model, which reportedly used fewer chips.
“These innovations can reduce the computational and energy costs of AI,” de Vries-Gao said.
However, he said that increased efficiency could further encourage the use of AI. Multiple countries attempting to build their own AI systems, a trend known as “sovereign AI”, could also increase demand for hardware. De Vries-Gao also pointed to the US data center startup Crusoe Energy, which has secured 4.5 GW of gas-powered energy capacity for its infrastructure, with ChatGPT developer OpenAI among the potential customers through its Stargate joint venture.
“There are early indications that these [Stargate] data centers could exacerbate dependence on fossil fuels,” de Vries-Gao writes.
On Thursday, OpenAI announced the launch of a Stargate project in the United Arab Emirates, its first outside the United States.
Microsoft and Google admitted last year that their AI drives were putting at risk their ability to meet internal environmental targets.
De Vries-Gao said there is increasingly little information available about AI's power demands, describing it as an “opaque industry”. The EU AI Act requires AI companies to disclose the energy consumed in training a model, but not in its day-to-day use.
Professor Adam Sobey, mission director for sustainability at the Alan Turing Institute, the UK's AI research institute, said more transparency is needed about how much energy artificial intelligence systems consume, and how much they could save by helping to make carbon-intensive industries, such as transport and energy, more efficient.
Sobey said: