AI is making the energy future sustainable, but its energy demand is enormous

Artificial Intelligence can enhance sustainability, but it is not inherently sustainable.

Whether Artificial Intelligence (AI) is good or bad depends not only on the value it delivers but also on its environmental consequences. Expectations for AI applications are especially high in the energy sector: in recent years, energy and electricity markets have increasingly turned to AI as a tool. AI can automate and optimize a range of energy-related processes, leading to greater efficiency and cost-effectiveness, better energy management, and a reduced environmental impact.

AI is considered a key element in achieving a sustainable, environmentally friendly, and efficient energy future. It should not be forgotten that its applications extend across all industries. AI has become a general-purpose technology.

When considering risks, we often think of job losses or data privacy rather than the ecological footprint. AI is not inherently sustainable, but it often aids in achieving sustainability goals more quickly. This is demonstrated by use cases in the energy sector.

Use cases in the energy sector

These are just a few examples. In the coming years, use cases across the entire energy industry are expected to increase significantly. Companies are developing their own GPT models, such as E.ON GPT, a generative artificial intelligence developed for the corporation. It supports research and text processing and serves as a virtual sparring partner for employees' energy-related questions.

ChatGPT, the popular chatbot, has now been available for nearly two years. Within just two months of launch it had around 100 million users, and new tools in every variation keep appearing. Training AI tools requires feeding them large amounts of data, a very energy-intensive process. Each query to ChatGPT and similar services consumes between three and nine watt-hours of electricity, and ChatGPT alone receives more than 195 million requests per day; for comparison, Google processes nine billion search queries daily. A recent study by the VU Amsterdam School of Business and Economics estimates that by 2027, AI could consume as much electricity as Ireland. On top of that comes a high water demand for cooling the data centers.
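A quick back-of-envelope calculation shows what the per-query figures above add up to. The numbers (3 to 9 Wh per query, roughly 195 million queries per day) are the rough public estimates cited in the text, not measured values:

```python
# Back-of-envelope estimate of ChatGPT's daily electricity use,
# based on the rough public figures cited above.

wh_low, wh_high = 3, 9        # assumed energy per query, in watt-hours
queries_per_day = 195e6       # assumed daily query volume

low_gwh = wh_low * queries_per_day / 1e9    # convert Wh to GWh
high_gwh = wh_high * queries_per_day / 1e9

print(f"roughly {low_gwh} to {high_gwh} GWh per day")
```

Even at the low end, that is over half a gigawatt-hour of electricity per day for a single service.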

AI consumes a significant amount of energy during training, and at the same time the models keep growing in order to achieve higher accuracy. While companies worldwide are working to improve the efficiency of AI software, demand for these tools continues to rise unabated. Researchers speak of a Jevons paradox: an increase in the efficiency with which a resource is used can lead to higher, not lower, overall consumption of that resource. This contradicts the intuitive assumption that higher efficiency automatically means lower resource consumption.

We need data centers that run entirely on renewable energy and that put the waste heat from their servers to meaningful use. Everything is interconnected. The providers of AI models must take responsibility and set a good example by addressing their own energy and water footprint.
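The Jevons paradox can be illustrated with hypothetical numbers. The figures below (a doubling of efficiency, a tripling of demand) are illustrative assumptions, not data from the text:

```python
# Hypothetical illustration of the Jevons paradox: energy per query
# improves, but demand grows faster, so total consumption still rises.

wh_per_query_before = 6.0   # assumed average energy per query (Wh)
queries_before = 195e6      # assumed daily query volume

# Suppose efficiency doubles (half the energy per query) ...
wh_per_query_after = wh_per_query_before / 2
# ... but cheaper, faster queries triple the demand.
queries_after = queries_before * 3

total_before = wh_per_query_before * queries_before / 1e9  # GWh per day
total_after = wh_per_query_after * queries_after / 1e9     # GWh per day

print(f"before: {total_before} GWh/day, after: {total_after} GWh/day")
```

Despite a 50 percent efficiency gain per query, total consumption in this sketch rises by half.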

Doris Höflich, Market Intelligence Senior Expert
