Doubts, answers, missed laws: delayed protection for minors
The correspondence, which emerged from the records, shows an internal debate at Big Tech since 2017
Key points
At the trial, a flood of internal emails emerged from employees worried about the wellbeing of users, that of their own children, and the scant resources invested in the safety of teenagers. Among them was an email from an executive in Meta's design department who wrote directly to Mark Zuckerberg, saying she would not "sleep at night" if she did not tell him of her concerns. One of her two teenage daughters had been hospitalised twice for body dysmorphia and, in her view, the social network should have eased the pressure of unattainable beauty standards to minimise the possible risks for minors.
'One day, looking back,' the woman wrote, 'I hope we can be proud of the decisions we made.' A sentence that reads like a warning, while the 'Team Teens' created by Zuckerberg focused on investing in engineers, enlisted to keep teenagers glued to their screens and to devise new ways of not losing users.
The documents
In an email dated 1 April 2020, Zuckerberg himself called it 'patronising' to limit people's ability to present themselves as they wished, referring to the internal discussion on whether or not to keep the beauty filters on Instagram. 'I'm not sure,' he continued, 'that at this point it's easy to change the setting.' For Zuckerberg, there was no clear data on the harm caused by the filters or by the workings of his algorithms.
In some internal emails from 2018, studies were cited claiming that social media use had roughly the same effect on adolescent depression as 'eating potatoes'. An underestimation of the risk which, in light of the latest rulings, had produced a dense correspondence revealing a heated internal debate as early as 2017 and an attempt to deflect external concerns onto other fronts. Meta executives monitored news coverage from major media outlets around the world and the legislation being passed, trying to settle on 'plausible' answers to use in their public statements.
As early as 2018, the research team was looking for scientific arguments to counter the narrative of "social media addiction", arguing that the causes should also be sought elsewhere, for example in social inequalities or individual health problems. A script that did not seem to hold up against much of the scientific evidence, or against criticism from Meta's own top management, one of whom wrote: 'The fact that we have age limits that are not respected makes it difficult to claim that we are doing everything possible.' Meta's wellbeing managers, whose work received little investment compared to the engineering teams, continually raised questions and doubts, showing that they cared about the problem but could not do enough: "Is there any way to immediately redirect a person to support if they search for or try to post self-harming content, instead of showing it to them?"; or: "We cannot put all the blame on the user without acknowledging our role in the problem". And again: 'The fact that we say we don't allow children under 13 access to our platforms, but have no way to enforce it, is simply indefensible.'