Technology

Supercomputing enters the enterprise to accelerate innovation

Analysts estimate that by 2027, 40 per cent of companies globally will invest in computing infrastructure designed for AI

by Antonio Larizza

The German supercomputer Jupiter Booster, the first exascale-class computing system switched on in Europe (photo by Sascha Kreklau)

3' min read

Translated by AI

The adoption of artificial intelligence in the enterprise is nearing a turning point. Soon, the game will no longer be played only on software, but also on computing power. By 2027, 40 per cent of companies will invest in supercomputing infrastructure designed for artificial intelligence, with the aim of developing advanced applications, increasing productivity or defending their competitive advantage. Or all three at once. The need for computing power will reshape IT spending priorities, opening a new cycle of industrial investment.

The scenario is described in the report Top Strategic Technology Trends for 2026, in which Gartner analysts outline the technology trends for the next five years. If confirmed, it could mark the start of a new industrial phase for artificial intelligence in the service of business. A phase marked by the development of more mature and complex models, the automation of critical processes and, not least, the need to accelerate innovation.


The convergence of supercomputing and artificial intelligence is nothing new. Indeed, it underpins the birth of generative artificial intelligence, launched in 2022 by OpenAI with ChatGPT, and of all its derivatives. A birth made possible precisely by the encounter between Nvidia's powerful GPUs and algorithms capable of training large language models (LLMs). That encounter made intelligence artificial, giving rise to a technology capable of understanding natural language and autonomously generating text, videos and images.

This imitation game rests on the training of models. And training requires huge computing resources. The international body IEEE (Institute of Electrical and Electronics Engineers) estimates that data centres powering artificial intelligence already consume more electricity globally than the whole of Mexico: 600 TWh per year against 550. Admittedly, OpenAI's competitors, such as China's DeepSeek, have shuffled the cards by launching generative AIs that deliver the same performance on fewer computing resources. But the same cannot be said of their training.

All this is making supercomputing a scarce resource. Computing centres have become centres of power. A power that, globally, lies in the hands of a few private big tech companies with supranational reach: Google, Amazon and Microsoft. The same private players that have invested hundreds of billions to buy GPUs - a single Nvidia H200 chip can now cost up to $40,000 - and to transform data centres from infrastructure for cloud services and data storage into centres where artificial intelligences are trained and put to work, with the sole aim of generating profits.

But all is not lost. The European Union - lagging behind in product development, but more attentive to the common good - is today the continent with the largest public supercomputing infrastructure in the world. Around this infrastructure, the first AI Factories have sprung up - including the Italian one in Bologna, financed with 430 million euro - and five AI Gigafactories are already under study. The aim is to make computing resources and expertise available to industry, but also to SMEs and public administrations, in order to chart a European path to artificial intelligence.

As certified by the Top500, the ranking of the world's most powerful supercomputers, the machine with the most computing power today is American. It is called El Capitan and can perform 1.7 quintillion operations per second. To match that computing power, a human being would have to perform one operation per second for 87 billion years. With 18 supercomputers in the ranking, Italy is fourth, behind the United States (by a wide margin) and Japan and Germany (by narrow ones). Europe also recently marked the switch-on of the German supercomputer Jupiter Booster, the continent's first exascale-class computing system.
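The human-versus-machine comparison above can be checked with back-of-envelope arithmetic. A minimal sketch, assuming the Top500 figures of roughly 1.7 quintillion sustained operations per second for El Capitan and roughly 2.75 quintillion at theoretical peak (the 87-billion-year figure corresponds to the peak rate; the sustained rate gives about 54 billion years):

```python
# Back-of-envelope check: how many years would a human, performing one
# operation per second without pause, need to match one second of El
# Capitan's work? Throughput figures below are assumptions taken from
# public Top500 data (sustained vs theoretical peak).
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # about 3.16e7 seconds

def years_at_one_op_per_second(ops: float) -> float:
    """Years needed to perform `ops` operations at a rate of 1 op/s."""
    return ops / SECONDS_PER_YEAR

if __name__ == "__main__":
    rates = [
        ("sustained, ~1.7 quintillion ops/s", 1.7e18),
        ("theoretical peak, ~2.75 quintillion ops/s", 2.75e18),
    ]
    for label, ops in rates:
        years = years_at_one_op_per_second(ops)
        print(f"{label}: ~{years / 1e9:.0f} billion years")
```

The spread between the two results simply reflects which benchmark figure one starts from; the order of magnitude, tens of billions of years, holds either way.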

While there is no shortage of computing infrastructure, it is not a foregone conclusion that Italian and European companies will start using it on a massive scale, as Gartner analysts predict globally. A survey commissioned by IT4LIA - the Italian AI Factory - and conducted between October and November 2025 on a sample of more than 200 organisations, including SMEs, large enterprises, start-ups, universities, research centres and public administrations, shows that 65 per cent of respondents cite a lack of skills as the main obstacle to the adoption of AI. This is followed by limited budgets and high costs (50 per cent), the need for support in defining projects (48 per cent), difficulties in identifying use cases (45 per cent), problems related to data management and quality (38 per cent), and uncertainties about the AI Act, GDPR and regulatory compliance (33 per cent). Attention to artificial intelligence is high, but the ability to implement projects is limited. Overall, one in two organisations risks being excluded from advanced AI experimentation not for lack of interest, but because of difficulties in accessing expertise, infrastructure and specialised support.

Copyright reserved ©
