Artificial intelligence applications can use as much power as entire nations
10-10-2023

As advancements in artificial intelligence (AI) promise improvements across sectors from coding to driving, concern is also growing over the technology's environmental impact and the amount of energy its applications consume.

In a recent commentary published in the journal Joule, Alex de Vries, the founder of Digiconomist, shed light on the significant energy footprint of widespread AI adoption, which could eventually rival the power demands of entire nations.

Energy used to train AI services

The commentary warns that surging demand for artificial intelligence services could drive a sharp rise in energy consumption. Alex de Vries, also a Ph.D. candidate at Vrije Universiteit Amsterdam, noted, “Looking at the growing demand for AI service, it’s very likely that energy consumption related to AI will significantly increase in the coming years.”

The past year has seen a rapid surge in the popularity of generative AI, tools capable of producing text, images, and other forms of data. OpenAI’s ChatGPT is a notable example.

However, the training phase for these tools, which involves processing vast amounts of data, is exceptionally energy-intensive. Hugging Face, a New York-based AI firm, reported that its multilingual text-generating tool consumed approximately 433 megawatt-hours (MWh) during training, enough to power 40 average American homes for an entire year.
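As a rough sanity check on that comparison, a back-of-envelope calculation is sketched below; the figure of roughly 10.8 MWh of electricity per average U.S. home per year is an assumption introduced here for illustration, not a number from the article.

```python
# Back-of-envelope check: 433 MWh of training energy vs. average U.S. homes.
# The household figure (~10.8 MWh/year) is an assumed round number, not from the article.
training_energy_mwh = 433            # reported training energy for the multilingual model
home_use_mwh_per_year = 10.8         # assumed average U.S. household electricity use per year

homes_for_one_year = training_energy_mwh / home_use_mwh_per_year
print(f"~{homes_for_one_year:.0f} average homes powered for a year")   # ~40 homes
```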

Jevons’ Paradox

Yet the energy concerns surrounding AI are not limited to its training phase. As de Vries pointed out, each time these tools operate, such as when generating responses to prompts, they demand a significant amount of computing power and, consequently, energy. A prime example is ChatGPT, which could require roughly 564 MWh of electricity per day to keep running.
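To put that daily figure in context, the short sketch below annualizes the estimate, reusing the same assumed household average of roughly 10.8 MWh per year, which is not stated in the article.

```python
# Scale the estimated daily electricity use of ChatGPT to a yearly total.
# The household average is the same assumed figure as above, not from the article.
chatgpt_mwh_per_day = 564
annual_mwh = chatgpt_mwh_per_day * 365        # ~205,900 MWh, i.e. ~0.21 TWh per year
homes_equivalent = annual_mwh / 10.8          # ~19,000 average U.S. homes

print(f"~{annual_mwh / 1e6:.2f} TWh per year, roughly {homes_equivalent:,.0f} homes")
```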

There is a global push to make AI tools more efficient, in both hardware and software, with the aim of lowering their energy costs. However, de Vries offered a cautionary perspective, citing Jevons’ paradox: as a technology becomes more efficient, the cost of using it falls, demand rises, and total resource consumption can increase rather than decrease. “The result of making these tools more efficient and accessible can be that we just allow more applications of it and more people to use it,” de Vries explains.
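A toy rebound-effect calculation, with purely hypothetical numbers, illustrates the mechanism: even if efficiency halves the energy needed per query, total consumption rises whenever usage grows faster than efficiency improves.

```python
# Illustrative Jevons' paradox / rebound-effect sketch; all numbers are hypothetical.
energy_per_query_wh = 3.0        # assumed baseline energy per AI query
queries_per_day = 100e6          # assumed baseline usage

baseline_total_mwh = energy_per_query_wh * queries_per_day / 1e6              # 300 MWh/day
# Efficiency doubles (half the energy per query), but usage triples.
rebound_total_mwh = (energy_per_query_wh / 2) * (queries_per_day * 3) / 1e6   # 450 MWh/day

print(f"baseline: {baseline_total_mwh:.0f} MWh/day, after rebound: {rebound_total_mwh:.0f} MWh/day")
```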

Highlighting the potential scale of this issue, de Vries points to tech giant Google. The company is integrating generative AI into its email services and experimenting with AI-powered search. Given that Google handles roughly 9 billion searches daily, de Vries’ calculations suggest that if every Google search incorporated AI, the annual electricity requirement could climb to 29.2 TWh, roughly Ireland’s yearly electricity consumption.
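Working backwards from the two figures in that paragraph, 9 billion searches per day and 29.2 TWh per year, gives the implied energy per AI-assisted search; this is a derivation from the article's own numbers rather than a figure quoted from de Vries.

```python
# Implied per-search energy from the article's figures for an all-AI Google search.
searches_per_day = 9e9            # reported daily Google searches
annual_energy_twh = 29.2          # projected annual electricity use if every search used AI

searches_per_year = searches_per_day * 365                    # ~3.3 trillion searches
wh_per_search = annual_energy_twh * 1e12 / searches_per_year  # TWh -> Wh, then per search
print(f"~{wh_per_search:.1f} Wh per AI-assisted search")       # ~8.9 Wh
```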

AI energy projections are mind-boggling

While such a scenario is unlikely in the near term, given the cost of AI servers and bottlenecks in their supply chain, projections indicate a rapid expansion in AI server production. By 2027, AI-related electricity consumption worldwide could grow by 85 to 134 TWh annually, a range comparable to the yearly electricity consumption of countries such as the Netherlands, Argentina, and Sweden.

Moreover, improvements in AI efficiency could allow developers to repurpose some computer processing chips for AI use, adding further to its electricity consumption.

In his concluding remarks, de Vries emphasized the importance of a mindful approach to AI application. “The potential growth highlights that we need to be very mindful about what we use AI for. It’s energy intensive, so we don’t want to put it in all kinds of things where we don’t actually need it.”

As AI continues its onward march, ensuring its sustainable and responsible growth will undoubtedly remain a topic of intense debate and discussion.
