Digital Economy

When the Internet was ours: now only algorithms surf. Happy Birthday, 'WorldWideWeb'

Thirty-five years after the HyperText Project, we have entered the era of the web agent, a web that may no longer need humans

by Luca Tremolada

3 min read

Translated by AI
Italian version


The birth of the web has been debated for years, not least because newspapers are ravenous for anniversaries to celebrate. 13 November 1990, for example, is the day when Tim Berners-Lee, at CERN in Geneva, together with Robert Cailliau, officially presented a proposal for the development of the system that would become the World Wide Web: an Internet-based hypertext environment, accessible via a browser, capable of linking documents and resources through links. The document was entitled 'WorldWideWeb: Proposal for a HyperText Project'.

The first WWW page in history went public a year later, on 6 August 1991, hosted right at CERN on a NeXT computer. It contained a wealth of useful information on the project, including how to create HTML pages. You can find it here in all its glory.


For a long time in the 1990s, the Internet was the stuff of cultural avant-gardists: computer enthusiasts with lots and lots of time on their hands. The modem, usually never far from the PC, was a small grey or beige box with a row of little green or red lights flashing in sequence. Once a phone number was dialled, it would start to croak and whistle. Web pages loaded very slowly, one piece at a time. Typing 'www-something' and pressing enter was often a leap in the dark: minutes of waiting, perhaps to connect to some distant university, only to find there was still nothing really interesting on the site.

The Internet was a 'job' you did sitting on a chair: there were no smartphones yet. It was a desk activity; you couldn't do it lying down. For younger people it was something loosely related to studying or video games (if you had a PC in your bedroom). At work in the nineties there was sometimes an Internet station, maybe next to the fax machine.

In the newspapers or on TV, those who surfed the Internet were called 'Internet users' or, even worse, 'net people'. They were considered something new and different from 'normal' citizens. That is why technology journalists were asked to find an odd angle for the time: the barber who opens a website even though he has no reason to, the young nerd broadcasting images from his webcam 24 hours a day, or the couple who met online and got married in the real world. Outside the headlines, those who lived on the Internet in the 1990s communicated a lot and freely. There were no big digital platforms yet, no Google to organise online knowledge, no search engines to normalise traffic, no social networks to exploit our likes. We were all a bit freer, because it really was a no-man's land, but one increasingly crowded with experimenters and the curious.

What is happening today is the exact opposite. The invasion of intelligent agents, children of the AI revolution, could make the Internet a place where machines communicate with machines: where AI agents negotiate with other AI agents on our behalf, while we chat with our personal chatbot. The race between giants like Google, Microsoft, OpenAI and Anthropic is no longer just about the power of their Large Language Models (LLMs), but about their ability to act in the real world.

The web as we know it, made up of sites, links and pages to be actively navigated, is destined to become an environment where the work is done by autonomous software agents. It will no longer be us consulting the Internet (the passive web); our digital agents will read, analyse, compare and, above all, act on our behalf (the agentic web). Standards for orchestrating communication between agents, sites, chatbots and humans are already beginning to emerge. The Model Context Protocol (MCP), for instance, has been compared to the web's HTTP: if HTTP is the backbone of hypertext information transfer on the web, MCP aims to become the backbone for access to context, tools and actions for agentic AI.

We know that what is really at stake is the automation of business processes. But for those of us who have known the Internet from its birth, something has changed. The agentic web may be more efficient, finally truly useful, but there we will all feel a little more alone.
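To give a concrete sense of the HTTP comparison: MCP messages are framed as JSON-RPC 2.0, with an agent asking a server to invoke a tool on the user's behalf. Below is a minimal sketch of what such a request looks like; the `tools/call` method follows MCP's published framing, while the tool name and arguments (`search_flights`) are purely hypothetical, invented for illustration.

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build an MCP-style 'tools/call' request (JSON-RPC 2.0 framing).

    The tool name and arguments are supplied by the caller; here they
    stand in for whatever capability a real MCP server would expose.
    """
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical example: an agent booking research on the user's behalf.
request = make_tool_call(1, "search_flights", {"from": "MXP", "to": "GVA"})
print(json.dumps(request, indent=2))
```

The point of the analogy is that, just as any browser can speak HTTP to any site, any agent speaking this framing can discover and invoke tools on any MCP server, without site-specific integration code.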

Copyright reserved ©
  • Luca Tremolada, Journalist

    Location: Milan, via Monte Rosa 91

    Languages spoken: English, French

    Topics: Technology, science, finance, startups, data

    Awards: Gabriele Lanfredini prize for journalism; State Street journalism prize, "Innovation" category; DStars 2019, journalism category
