Power-hungry AI: How ChatGPT strains our climate!

OpenAI reports on the energy consumption of AI requests and the future challenges posed by rising electricity demand. (Symbolic image/DNAT)

Germany - The use of artificial intelligence (AI) increasingly poses ecological challenges. According to OpenAI, a single request to ChatGPT, one of the best-known AI platforms, consumes about as much electricity as running an oven for one second. OpenAI CEO Sam Altman sees this as both a problem and an opportunity for the technology's future. He is optimistic about the role of AI in a wealthier future, despite concerns about possible job losses.

Some figures illustrate the high resource consumption of AI systems. The water consumption for a single request is only about one fifth of a teaspoon. Nevertheless, this adds up enormously given the large number of daily queries. According to Tagesschau, training GPT-3 is estimated to have required 5.4 million liters of water, including 700,000 liters for cooling the data centers.
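
To make that scale tangible, here is a minimal back-of-the-envelope sketch in Python. The per-request value is the article's one fifth of a teaspoon; the daily query count is a purely assumed, hypothetical number chosen only for illustration, not a figure reported by OpenAI.

```python
# Back-of-the-envelope scaling of ChatGPT's per-request water footprint.
# The per-request value (one fifth of a teaspoon) comes from the article;
# the daily query volume is an ASSUMED, hypothetical figure for illustration.

TEASPOON_ML = 4.93                        # one teaspoon in milliliters
water_per_request_ml = TEASPOON_ML / 5    # ~1 ml per request (article figure)

assumed_requests_per_day = 1_000_000_000  # hypothetical: one billion queries per day

daily_liters = water_per_request_ml * assumed_requests_per_day / 1_000
print(f"Water per day:  {daily_liters:,.0f} liters")         # ~986,000 liters
print(f"Water per year: {daily_liters * 365:,.0f} liters")    # ~360 million liters
```

Under these assumptions, roughly a million liters of water per day would be attributable to query handling alone, before training is even counted.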

The growing energy demand

The technology companies Microsoft, Google, and Amazon already have strategic plans to meet their growing energy needs. Among other things, they are turning to nuclear power in order not to increase their carbon dioxide emissions. The energy demand of the data centers that support AI applications is a major issue. A worrying trend: electricity consumption by data centers in Germany rose by 70 percent between 2010 and 2021. This is slower than the growth in demand placed on these facilities, because the servers are becoming more efficient.

Another aspect is the water consumption required to cool the servers. By 2030, water consumption for server cooling is forecast to reach 664 billion liters, almost four times as much as in 2023. This is driven by the operation of air and water cooling systems, which require significant amounts of water.

Environmental policy and sustainability

The increasing use of AI is leading to a worrying rise in greenhouse gas emissions. Forecasts indicate that emissions will grow from 212 million tons to 355 million tons by 2030. To counter this development, companies and political decision-makers are called upon to take action. On behalf of Greenpeace Germany, the Öko-Institut recommends binding transparency requirements and an efficiency label for data centers.

A forward-looking approach could be the integration of data centers into renewable energy and heating networks. The aim is to ensure that AI developments do not hinder climate protection but actively contribute to the energy transition. However, the debate about the sustainability of AI is only just beginning, warns Ingenieur.de. Social framework conditions must be created in order to make the best use of AI's opportunities for climate protection and to minimize its risks.

In view of the challenges arising from the increasing energy and water consumption of AI data centers, there is an urgent need for further studies to measure the environmental impact more precisely and to develop measures to reduce it. The discussion about AI and its environmental consequences will continue to gain importance.
