Social media and minors: between bans and digital education in Europe and Australia
The global debate highlights the need for effective regulation, corporate responsibility and safety education that does not exclude young people
Australia's ban on social media access for under-16s has reignited a global debate that directly affects children and teenagers. It is a radical measure, born of a desire to protect them from real risks such as addiction, cyberbullying and exposure to harmful content, but it may prove counterproductive.
In Europe, online child protection has moved to the top of the political agenda. The European Parliament's recent resolution revived the proposal for a uniform digital age of consent, and in November the EU Council laboriously reached an agreement on the draft Regulation for the prevention of online child abuse. Meanwhile, member states are working to make the age verification provided for by the Digital Services Act a reality; in Italy, legislative initiatives revolve around age restrictions on access to inappropriate content, and an update to the Italian Society of Paediatrics guidelines pushes for higher age thresholds. All of this signals a strong desire for protection, but also an unresolved tension between safety and participation. The discussion is not only about mental health or cognitive development; it also concerns the quality of democracy and the ability of institutions to guarantee minors safe and informed access to digital environments, where an important part of the social life of adolescents and the very young takes place.
It is essential for states to address child safety online, but the real challenge is not to set an age threshold; it is to ensure that children can grow up in safe and age-appropriate digital environments. A blanket ban like the one approved in Australia risks creating a false sense of security, pushing children onto unregulated platforms and shifting responsibility onto families and schools. What is needed instead is a step change: platforms designed with safety built in, privacy and protection by default, responsible algorithms and age-appropriate interfaces. In other words, safety must be built around children, not imposed against them.
For millions of teenagers, social media are spaces for relationships, information, and civic participation. Depriving them of these opportunities means reducing their voice in society.
As the Australian case has highlighted, the issue also concerns the responsibility of technology companies. The burden of protection cannot keep being shifted onto parents and teachers, who often lack adequate tools in a rapidly changing environment. Companies must take on binding obligations: automatic safety settings for minors' accounts, a ban on high-risk features such as public geolocation and anonymous chats, and transparency about algorithms and recommendation systems. The protection of minors cannot be an optional extra; it must be a legal requirement for anyone who operates digital spaces.