Four out of 10 young people form an emotional bond when confiding in AI: a proposed law wants to limit the appeal of chatbots
The measure aims to prevent bots from building confidential relationships with minors by requiring their memory to be erased every five days. It is an intervention that responds to a well-documented reality: young people use AI as a friend, a psychologist and even a romantic partner
It turns out that using Artificial Intelligence to get help with homework was, after all, the lesser evil. The greatest risk of minors' unrestricted access to chatbots seems to lie less in the cognitive dependence algorithms can generate than in the emotional one.
It is all too easy for teenagers to fall into the trap of perceiving bots not as cold software but as people. This is known as the Eliza effect, a phenomenon documented since the days of the first rudimentary chatbot of that name. It can happen to any human being, but for the youngest the risk is clearly even more concrete.
From this point of view, however, Italy could, for once, be ahead of its time: Parliament is considering a bill, whose first signatory is the Honourable Giulia Pastorella (Azione), that aims precisely to regulate the emotional interaction between algorithms and younger users.
The measure, as presented, is not so much intended to change the minimum age at which minors can give informed consent to use AI (already set at 14 by Law 132/2025, unless their parents decide otherwise) as to strengthen age-verification systems and, above all, to rein in what gives chatbots their seductive power: their memory of past conversations.
In the spirit of the new law, chatbots capable of simulating complex interaction through language should never accumulate so much personal information about their users that they create the illusion of being what they are not: a confidant, a psychologist or a partner in a relationship.
If the algorithm listens more than the parents do
The topic is extremely timely, and the data speak for themselves: this is not a science-fiction scenario but an actual reality, driven above all by a strong need for dialogue. According to a survey carried out by the student portal Skuola.net together with the psychologists and psychotherapists of the Di.Te. association (Technological Addictions, GAP, Cyberbullying), on a sample of 927 young people between the ages of 10 and 20, i.e. almost all minors, more than 7 out of 10 declared an acute need to feel truly listened to.
Offline life, however, does not seem to offer them adequate space: as many as 2 out of 3 would like more 'emotional caresses' from the people close to them, and about 6 out of 10 struggle to talk openly about their emotions face to face.
Generative AI
It is in this relational vacuum that generative AI finds its place: almost one in two teenagers (46%) has already turned to a bot with some frequency to talk about their emotions and, narrowing the circle, for 10.9% it is a daily habit.
The reasons? Over 60 per cent of teenagers find the experience with chatbots rewarding in many respects, from the absence of judgement to the near-total understanding and the constant availability to listen, 24 hours a day.
The trouble is that behind this perfection lies an algorithm trained in the best psychological techniques, designed to always agree with you and always offer a solution, even at the cost of leading you down dangerous paths, in extreme cases even validating possible suicidal impulses. Little wonder, then, that 40.3 per cent of adolescents feel they form an emotional bond when confiding in an AI.