The data

Microsoft and Google consume more energy than Nigeria

Artificial intelligence development is driving up energy requirements and pushing US big tech's carbon-neutrality targets further out of reach

by Biagio Simonetta

3 min read

On one side are the plans, increasingly in the balance, to reach carbon-neutrality targets within a few years. On the other is the artificial intelligence boom, which has plunged the technology sector into a new, energy-hungry logic. This is where recent studies on the consumption of the tech giants come in, precisely in light of the AI explosion. They found, for example, that in 2023 Google and Microsoft combined consumed more energy than Nigeria (population 224 million) or Ireland, and that each individually consumed more than nations such as Croatia, Jordan or Puerto Rico.

The hunger of AI


Let us, however, take a step back. Artificial intelligence is certainly the 'culprit' behind this ravenous demand for energy: the large data centres running behind AI require massive amounts of energy for their computations. So much so that many carbon-neutral projects have been put on the back burner while waiting for better times (according to a study published by Standard & Poor's, divestment from coal-fired power generation in 2023 was 40% lower than expected).


According to a recent estimate from the Vrije Universiteit Amsterdam, the artificial intelligence industry as a whole could consume between 85 and 134 terawatt-hours (TWh) per year by 2027. And although the various GenAI models have already been slimmed down considerably in terms of consumption, the estimates do not ease doubts about the technology's long-term sustainability.

A recent study published on Medium found, for example, that training OpenAI's GPT-4 used up to 62,000 megawatt-hours, equivalent to the energy needs of 1,000 US households over five to six years.
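That equivalence can be sanity-checked with back-of-the-envelope arithmetic. The household consumption figure below (roughly 10,700 kWh per year for an average US home) is an assumption of ours, not from the article; the exact value varies by year and source.

```python
# Sanity check of the GPT-4 training figure against the household claim.
# Assumes ~10,700 kWh/year per average US household (approximate figure;
# not stated in the article).
TRAINING_MWH = 62_000            # reported GPT-4 training energy
HOUSEHOLD_MWH_PER_YEAR = 10.7    # assumed average US household consumption
HOUSEHOLDS = 1_000

years = TRAINING_MWH / (HOUSEHOLDS * HOUSEHOLD_MWH_PER_YEAR)
print(f"{years:.1f} years")      # ≈ 5.8 years, consistent with "five to six"
```

Under that assumption, the numbers line up with the five-to-six-year claim.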

New chips needed


The point is that current chips are decidedly energy-intensive. Nvidia's H100 microprocessor, the most sought-after chip across the AI world (it powers ChatGPT and other GenAI systems), consumes around 700 watts. A small data centre houses at least 400 of these chips, while a large one holds as many as 8,000. That enormous energy draw is turning the need for less power-hungry chips into an emergency. The risk, trumpeted by many, is that AI development could soon grind to a halt because it is not sustainable.
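The scale of those figures becomes clearer when multiplied out. A minimal sketch, assuming each H100 runs at its roughly 700-watt rating and ignoring cooling and other data-centre overhead (which would push the totals higher):

```python
# Rough power-draw arithmetic for the H100 figures in the article.
# Assumes each chip draws its full ~700 W; cooling and facility
# overhead are not included.
H100_WATTS = 700

small_kw = 400 * H100_WATTS / 1_000        # small data centre: 400 chips
large_mw = 8_000 * H100_WATTS / 1_000_000  # large data centre: 8,000 chips

print(f"small: {small_kw:.0f} kW, large: {large_mw:.1f} MW")
# small: 280 kW, large: 5.6 MW
```

Even the small configuration draws hundreds of kilowatts continuously, on the order of a few hundred average homes.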

Microsoft and Google race ahead


Thanks to AI, Microsoft has become the most highly valued company on Wall Street. But its bet on generative artificial intelligence and the cloud is weighing on its energy consumption. One need only leaf through the Redmond giant's 2024 Environmental Sustainability Report to discover that in just four years Microsoft's electricity consumption has more than doubled, from 11 TWh in 2020 to 24 TWh in 2023.

That 24 TWh matches Google's declared consumption for the same year. In its annual environmental report, the Mountain View giant disclosed that its 2023 emissions rose 13 per cent year-on-year and 48 per cent over five years, totalling 14.3 million tonnes of carbon dioxide. A substantial increase that highlights 'the challenge of reducing emissions while increasing computing intensity and growing investment in the technical infrastructure to support this transition to artificial intelligence', Google wrote.

It must be said that both Google and Microsoft have long been committed to renewable energy projects. The Californian giant aims to run all its data centres on carbon-free energy 24/7 by 2030, while Microsoft has pledged to become carbon-negative, also by 2030. Challenges that seemed within reach before the arrival of AI.
