AI energy consumption has become a major topic because of the colossal amounts of energy required to run a Google search, talk to Siri, or ask ChatGPT to do something.
One study estimates that by 2027, AI's energy consumption will rival that of an entire country, such as Argentina or Sweden.
A single ChatGPT prompt is estimated to consume, on average, as much energy as forty mobile phone charges.
Despite this, the research community and industry have yet to make the development of energy-efficient, more climate-friendly AI models a priority, according to computer science researchers at the University of Copenhagen.
Why is AI energy consumption so high?
Training AI models consumes a lot of energy and thereby emits a lot of CO2e (carbon dioxide equivalent). This is because training involves intensive computations, typically run on powerful hardware.
Energy consumption is particularly high for large models, such as the language model behind ChatGPT.
AI tasks are often processed in data centres, which demand significant power both to run the computers and to cool them. The energy source for these centres, which may rely on fossil fuels, influences their carbon footprint.
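As a rough illustration of how such a footprint is estimated, here is a minimal back-of-envelope sketch. All figures in it (GPU power draw, cluster size, training time, cooling overhead, and grid carbon intensity) are illustrative assumptions, not numbers from the study:

```python
# Back-of-envelope estimate of training energy and emissions.
# Every number below is an illustrative assumption, not a figure from the study.

GPU_POWER_KW = 0.3      # assumed average draw of one GPU (300 W)
NUM_GPUS = 8            # assumed size of the training cluster
TRAINING_HOURS = 120    # assumed wall-clock training time
PUE = 1.5               # assumed Power Usage Effectiveness of the data centre
                        # (overhead for cooling etc.; 1.0 means no overhead)
CARBON_INTENSITY = 0.4  # assumed grid mix, in kg CO2e per kWh

energy_kwh = GPU_POWER_KW * NUM_GPUS * TRAINING_HOURS * PUE
co2e_kg = energy_kwh * CARBON_INTENSITY

print(f"Energy: {energy_kwh:.0f} kWh, emissions: {co2e_kg:.0f} kg CO2e")
# Energy: 432 kWh, emissions: 173 kg CO2e
```

Note how the two factors mentioned above enter the estimate: cooling overhead as the PUE multiplier, and the energy source as the carbon intensity of the grid.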
Assistant Professor Raghavendra Selvan from Copenhagen’s Department of Computer Science explained: “Today, developers are narrowly focused on building AI models that are effective in terms of the accuracy of their results.
“As a result, AI models are often inefficient in terms of energy consumption.”
But a new study, co-authored by him and computer science student Pedram Bakhtiarifard, demonstrates that a great deal of AI energy consumption can be curbed without compromising a model's precision.
Reducing the climate impact of AI
In their study, the researchers calculated the energy required to train more than 400,000 convolutional neural network AI models, without actually training them all.
Based on these calculations, the researchers present a benchmark collection of AI models, a kind of cookbook of recipes, that use less energy to solve a given task while performing at approximately the same level.
The study shows that by opting for other model types or by adjusting models, AI energy consumption can be reduced by 70-80% across the training and deployment phases, at a cost of 1% or less in performance.
Pedram Bakhtiarifard said: “The recipes describe not just the performance of different algorithms but also how energy efficient they are.
“By swapping one ingredient with another in a model’s design, one can often achieve the same result. So now, practitioners can choose the model they want based on both performance and energy consumption without needing to train each model first.”
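To make this concrete, here is a minimal sketch of how a practitioner might choose from such a pre-computed collection. The benchmark table, model names, and all numbers below are hypothetical, invented for illustration; they are not entries from the study's actual collection:

```python
# Choosing a model from a pre-computed benchmark of (accuracy, energy) pairs.
# The entries below are hypothetical; a real collection would come from
# published measurements, not from training the models yourself.

benchmark = [
    # (model name, estimated accuracy %, estimated training energy in kWh)
    ("convnet-large",  94.1, 500.0),
    ("convnet-medium", 93.6, 150.0),
    ("convnet-small",  92.8,  60.0),
    ("convnet-tiny",   88.0,  20.0),
]

def pick_model(benchmark, max_accuracy_drop=1.0):
    """Return the lowest-energy model whose accuracy is within
    `max_accuracy_drop` percentage points of the best model."""
    best_acc = max(acc for _, acc, _ in benchmark)
    candidates = [m for m in benchmark if best_acc - m[1] <= max_accuracy_drop]
    return min(candidates, key=lambda m: m[2])

name, acc, kwh = pick_model(benchmark)
print(f"{name}: {acc}% accuracy at {kwh} kWh")
# convnet-medium: 93.6% accuracy at 150.0 kWh
```

In this made-up table, accepting a 0.5-point accuracy drop cuts training energy from 500 kWh to 150 kWh, the kind of trade-off the study describes, without ever training the candidates.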
Ensuring sustainable and responsible AI development
The researchers stress that in some fields, such as self-driving cars or certain areas of medicine, model precision can be critical for safety, and performance must not be compromised.
However, this should not deter practitioners from striving for high energy efficiency in other domains.
Selvan concluded: “AI has amazing potential. But if we are to ensure sustainable and responsible AI development, we need a more holistic approach that considers not only model performance but also climate impact.
“Here, we show that it is possible to find a better trade-off. When models are developed for different tasks, AI energy efficiency ought to be a fixed criterion – just as it is standard in many other industries.”