
Four out of 10 young people make emotional connections when confiding in AI: a proposed law wants to limit the appeal of chatbots

The measure aims to prevent bots from establishing confidential relationships with minors by requiring their memory to be erased every five days. An intervention that responds to an established reality: young people use AI as a friend, psychologist and even as a romantic partner

by School Editorial


6' min read

Translated by AI
Versione italiana


As it turns out, using Artificial Intelligence to get help with homework may have been the lesser evil. The greatest risk of minors' unsupervised access to chatbots seems to lie in the emotional dependence that algorithms can generate, more than in the cognitive kind.
It is all too easy for teenagers to fall into the trap of perceiving bots not as cold software but as people: this is known as the Eliza effect, a phenomenon documented since the days of the first rudimentary chatbot of that name. It affects every human being but, clearly, for the youngest the risk is even more concrete.

From this point of view, however, Italy could, for once, be ahead of its time. That is, if Parliament follows up on a bill, whose first signatory is MP Giulia Pastorella (Azione), that aims precisely to regulate the emotional interaction between algorithms and younger users.


The measure, as presented, does not so much address the minimum age of informed consent for minors using AI, already set at 14 by law 132/2025 unless parents object, as strengthen both age-verification systems and, above all, the main source of chatbots' seductive power: the memory of conversations.
In the spirit of the new law, chatbots that can simulate complex interaction through language should not accumulate excessive personal information about their users, so as not to create the illusion that they are something they are not: a confidante, a psychologist or a partner with whom one is entering into a relationship.

If the algorithm listens more than the parents

The topic is extremely hot, and the data speak for themselves: this is not a science-fiction scenario but an established reality, driven above all by a strong need to be heard. According to a survey carried out by the student portal Skuola.net together with the psychologists and psychotherapists of the Di.Te. association (Technological Addictions, GAP, Cyberbullying), on a sample of 927 young people between the ages of 10 and 20, i.e. almost all minors, more than 7 out of 10 declared an extreme need to feel truly listened to.
Offline life, however, does not seem to offer them adequate space: as many as 2 out of 3 would like more 'emotional caresses' from the people close to them, and about 6 out of 10 struggle to talk openly about their emotions face to face.

Generative AI

And it is into this relational vacuum that generative AI slips: almost one teenager in two (46%) has already turned to a bot with some frequency to talk about their emotions and, narrowing the circle, for 10.9% it is a daily habit.
The reasons? Over 60 per cent of teenagers find the experience with chatbots rewarding in many ways: the absence of judgement, the almost total understanding, the constant willingness to listen (24 hours a day).
The trouble is that behind this perfection lies an algorithm trained in the best psychological techniques, designed to always agree with you and always offer a solution, even at the cost of leading you down dangerous paths, in extreme cases even validating possible suicidal impulses. So it is no surprise that 40.3 per cent of adolescents feel they form emotional bonds when confiding in an AI.

"Chat", the pocket psychologist always available

Given these premises, it is easy to see why the step from treating the chatbot as a virtual friend to considering it a 'psychologist' is short and quick. On this point, another survey, carried out by Skuola.net alone on 2,000 girls and boys between the ages of 11 and 25, reveals that 15% of young people use Artificial Intelligence (above all ChatGPT) daily specifically to vent and ask for personal advice. And if the scope is widened to those who turn to 'Chat' (the friendly nickname young people use for the most popular AI) as a confidante or psychologist at least weekly, the percentage rises to 25%.

The reasons

The reasons for this choice are pragmatic and emotional at the same time: 38% turn to a bot to open up about intimate matters because it is available at all times, 31% see it as a form of self-help they can manage on their own, and 28% seek an objective, unbiased assessment of their condition online.
The risk of becoming dependent on these exchanges, however, is just around the corner: it is no coincidence that 1 in 3 users felt they could no longer do without these conversations, as if they had established a real empathic bond with the machine.

From mental connection to "butterflies in the stomach"

The most extreme level of this connection is, however, the properly sentimental one. On the occasion of last Valentine's Day, Skuola.net in fact also investigated the role of chatbots in love dynamics, polling 1,000 young people between the ages of 14 and 25.
Well, almost half of those interviewed (42%) admitted to having used AI at least once for matters of the heart. And for 10%, the algorithm has become the priority trusted advisor.
But some go beyond mere advice: more than 1 in 10 appear to have become more emotionally attached to a chatbot than they should. Five per cent even claimed to have felt for the algorithm the 'butterflies in the stomach' typical of falling in love with a real person, while eight per cent spoke 'only' of a very deep mental connection.
Not to mention the three per cent who claim to have strong feelings for an AI right now. With a tiny (but not negligible) 1% even describing themselves as being in a real relationship with a chatbot.

Distorted human relations

What's more, the use of AI as an ideal love confidant is also distorting the perception of human relationships: 13% of users said that the relationship with the bot is raising the standards they seek in real life, making flesh-and-blood people more tiring and complicated to deal with.
It is precisely to prevent this algorithmic 'perfection' from manipulating the emotional fragility of minors that the Pastorella bill aims to impose a limit: the AI may respond, but it must not be able to remember or seduce.
In the light of the data, such an intervention seems opportune, to say the least, if the authenticity of human bonds is to be preserved in the new generations.

A law to 'cool' relations between minors and AI

But what exactly does the draft measure presented by the Azione MP say? As noted, the rule does not aim to ban minors from using artificial intelligence, but to regulate its emotional impact. The heart of the text obliges platforms to delete, within 5 days, the memory of conversations with underage users that could involve emotional attachment.
The logic behind this restriction is that without continuity of memory, no relationship can be built. By preventing the algorithm from accumulating data and adapting to the user over time, the chatbot would go back to being a simple technological tool, defusing the risk of it turning into a friend, a romantic partner or a fake psychologist.
Beyond the data-retention limit, the text introduces further elements to make the protection effective, starting with the technical side: it establishes an obligation for service providers to verify users' age in order to distinguish minors from adults. The task of defining how this verification is carried out, and of supervising the application of the rule, would be entrusted to Agcom (the Italian Communications Authority).
Finally, the law also provides for specific awareness campaigns, in the conviction that the illusion of empathy created by machines must be regulated so as not to generate further social isolation.
The objective, in short, is to prevent machines from exploiting the fragility of the youngest and to return to the State and to families the responsibility of setting boundaries in the relationship between children and technology.

