In November 2024, just as Australia announced a ban preventing children under sixteen from accessing social media through the Online Safety Amendment (Social Media Minimum Age) Act 2024 (passed in December 2024), eSafety Commissioner Julie Inman Grant joined an academic roundtable hosted by the DFC during her visit to the UK for the Global Online Safety Regulators Network. The eSafety Commissioner is tasked with implementing and monitoring adherence to the Act.
Meeting participants included academics engaged in a wide range of current research with children in the digital environment, including work on experimentation with algorithmic feeds and the possibilities for algorithmic redesign; digital literacy (including AI literacy); and children’s accidental exposure to a range of content, contact, and conduct risks, most recently concerning sexual extortion and deepfakes, including in the metaverse.
The value of social media
many academics recognised the shortcomings of a ban and would not consider it supported by current research
The roundtable highlighted the importance that social media holds for children and young people, who use it as a space for connection and community.
Special concerns were expressed about disadvantaged children, for example children in care, children who are carers, and children with mental health issues.
Furthermore, LGBTQIA+ children and other minority groups have recognisable communities on social media, which the panel noted have very positive impacts.
Many academics recognised the shortcomings of a ban and would not consider it supported by current research.
Not all tech created equal
platforms reflect their users’ behaviours and their own underlying business models
However, the roundtable also stressed that children are at risk of being exposed to various harms in most digital environments. Even if ‘algorithmic perfection’ were possible, a risk of harm would still exist in social media environments as platforms reflect their users’ behaviours and their own underlying business models.
That said, the academics pointed out that some apps and platforms are riskier than others. Snapchat, for example, was highlighted as more problematic than most, with little effective age verification and ‘mass adding’ functions. Snapchat also poses further monitoring issues, as it straddles the line between a social media platform and a messaging service.
Legislative and reporting issues
we must not make “perfect the enemy of good”
Despite recent improvements in legislation, the roundtable addressed legislative barriers that exist for children’s safety. These included obscure and/or difficult reporting routes, big tech’s reluctance to act upon existing reports and research, and children’s uncertainty about where legal boundaries lie, i.e. when internet activity becomes illegal.
The academic roundtable agreed that we must not make “perfect the enemy of good”: passing some effective legislation now was considered better than further legislative delay.
Is digital literacy the answer?
digital and AI literacy [...] was highlighted as a key priority for policy intervention.
Digital and AI literacy education for children, parents, and education practitioners was highlighted as a key priority for policy intervention. Some argued that peer-to-peer delivery of AI literacy would be the most effective, while others emphasised both formal and informal teaching mechanisms.
The roundtable also stressed that digital literacy should not be the only intervention, particularly as teachers, parents and peers can lack resources and capacity to teach the topic successfully.
Where do we go from here?
Amongst other suggestions, the roundtable highlighted future considerations:
- Strengthening collaboration between disciplines (advocacy, law, data science and academia) and improving understanding of systems within big tech companies, both of which are essential for change.
- Recognising and limiting the risks faced by members of the scientific community who conduct research in this domain.
- Ensuring that companies are penalised for having ‘underage’ users, rather than punishing the children who access them.
- Monitoring and studying any social media ‘ban’ for children under 16 to determine how it impacts the rights of children.
The DFC would like to thank all participants for their time and contributions.
Roundtable participants
Julie Inman Grant. Australia’s eSafety Commissioner.
Dr Aiman El Asam, Associate Professor in Forensic Psychology, Kingston University London.
Professor Sarah-Jayne Blakemore, Professor of Psychology and Cognitive Neuroscience, Cambridge University.
Professor Julia Davidson, Department of Law and Criminology, School of Childhood and Social Care, University of East London.
Professor Peter Etchells, Professor of Psychology and Science Communication, Bath Spa University.
Professor Sonia Livingstone, Professor of Social Psychology and Director of Digital Futures for Children, London School of Economics and Political Science.
Professor Elena Martellozzo, Professor of Criminology, Middlesex University London.
Dr Amrit-Kaur Purba, Senior Research Associate, Digital Mental Health Group, Cognition and Brain Sciences Unit, University of Cambridge.
Professor Jessica Ringrose, Professor of Sociology of Gender and Education, UCL.
Dr Mariya Stoilova, Manager of Digital Futures for Children, London School of Economics and Political Science.
Dr Kim R. Sylwander, Postdoctoral Research Officer, Digital Futures for Children, Department of Media and Communications, London School of Economics and Political Science.
Professor Lorna Woods, Professor of Law, Essex University.
Dr Jun Zhao, Senior Researcher, Computer Science, Oxford University.
Relevant DFC and other readings
Rahali, M., Kidron, B., and Livingstone, S. (2024). Smartphone policies in schools: What does the evidence say? Digital Futures for Children centre, LSE and 5Rights Foundation.
van der Spuy, A., Witting, S., Burton, P., Day, E., Livingstone, S. and Sylwander, K.R. (2024). Guiding principles for addressing technology-facilitated child sexual exploitation and abuse. Digital Futures for Children centre, LSE and 5Rights Foundation.
Livingstone, S. and Stoilova, M. (2021). The 4Cs: Classifying Online Risk to Children. (CO:RE Short Report Series on Key Topics). Hamburg: Leibniz-Institut für Medienforschung | Hans-Bredow-Institut (HBI); CO:RE - Children Online: Research and Evidence.
Stoilova, M., Bulger, M. and Livingstone, S. (2023). Do parental control tools fulfil family expectations for child protection? A rapid evidence review of the determinants and outcomes of use. Journal of Children and Media, 18(1): 29-49.
Stoilova, M., Livingstone, S., and Khazbak, R. (2021). Investigating risks and opportunities for children in a digital world: A rapid review of the evidence on children’s internet use and outcomes. UNICEF Office of Research-Innocenti.