Supercomputing enters the enterprise to accelerate innovation
Analysts estimate that by 2027, 40 per cent of companies globally will invest in computing infrastructure designed for AI.
The adoption of artificial intelligence in the enterprise is nearing a turning point. Soon the game will no longer be played on software alone, but also on computing power. By 2027, 40 per cent of companies will invest in supercomputing infrastructure designed for artificial intelligence, with the aim of developing advanced applications, increasing productivity or defending their competitive advantage, or all of the above at once. The need for computing power will reshape the priorities of IT spending, opening up a new cycle of industrial investment.
The scenario is described in the report Top Strategic Technology Trends for 2026, in which Gartner analysts outline the technology trends of the next five years. If confirmed, it could mark the start of a new industrial phase for artificial intelligence in the service of business: a phase defined by more mature and complex models, the automation of critical processes and, not least, the need to accelerate innovation.
The convergence of supercomputing and artificial intelligence is nothing new. Indeed, it underpins the birth of generative artificial intelligence, launched in 2022 by OpenAI with ChatGPT, and of all its derivatives. That birth was made possible precisely by the encounter between Nvidia's powerful GPUs and algorithms capable of training large language models (LLMs), an encounter that gave rise to a technology able to understand natural language and autonomously generate text, video and images.
This imitation game is based on the training of models, and training requires huge computing resources. The IEEE (Institute of Electrical and Electronics Engineers) estimates that, globally, data centres already consume more electricity to power artificial intelligence than the whole of Mexico: 600 TWh per year versus 550. Admittedly, OpenAI's competitors, such as China's DeepSeek, have shuffled the cards by launching generative AIs that, for the same performance, run on fewer computing resources. The same, however, cannot be said of their training.
All this is making supercomputing a scarce resource. Computing centres have become centres of power, a power that globally rests in the hands of a few private big tech companies of supranational scale: Google, Amazon and Microsoft. These same private players have invested hundreds of billions to buy GPUs (a single Nvidia H200 chip can now cost up to $40,000) and to transform data centres from infrastructure for cloud services and data storage into centres where artificial intelligences are trained and put to work, with the sole aim of generating profits.


