Executive summary
Recent years have seen significant developments in legislation and regulation covering children’s privacy and safety in response to growing public concern and evidence of risks to children online.
The UK Age Appropriate Design Code (AADC) took effect in 2021, and the Online Safety Act (OSA) passed into law in 2023 and is now in a transitional period. The European Union (EU) has passed the Digital Services Act (DSA), which took full effect in 2024. More established legislation such as the General Data Protection Regulation (GDPR) and the US Children’s Online Privacy Protection Act (COPPA) also continue to play a role in regulation.
The report seeks to understand how these new developments in legislation and regulation may benefit children’s digital lives. It examines the impacts of legislative and regulatory measures focused on children’s online privacy and safety over the period 2017–24.
Research findings and conclusions
The research made the following findings about the changes announced by Meta, Google, TikTok and Snap:
- 128 changes were recorded during the period 2017–24.
- A peak of 42 changes was recorded in 2021, the year the AADC came into effect.
- Meta was the most active company – announcing 61 changes.
- The most common OECD risk category was content risk – 56 changes – followed by cross-cutting (41), contact (16), consumer (11) and conduct (4).
- The most common change type was ‘by default’ – 63 changes – followed by tools (37), information (21) and support (7).
It is reasonable to conclude that legislation and regulation are driving the companies to make a significant number of important child privacy and safety changes. These can provide substantive benefits in protecting children online. However, further research is needed to assess the full extent of those benefits, and further assessment will be needed as the DSA and OSA are fully implemented through 2025 and 2026.
Some of the most important changes recorded, linked to legislation and regulation, included social media accounts defaulted to private settings, changes to recommender systems and restrictions on targeted advertising to children.
The research also revealed that companies are relying heavily on tools such as parental controls in response to legislation and regulation. While the use of such tools validly responds to requirements in the AADC, GDPR and DSA, there is a risk of over-reliance on them as a privacy and safety measure: the evidence indicates low levels of use and efficacy for parental controls, as well as risks to child autonomy. Companies may come to rely on these tools to the exclusion of other measures.
The report also notes the risk that changes to age assurance and recommender systems could impact on other rights that children have. The impacts of these changes on rights, such as freedom of expression and non-discrimination, will need to be carefully monitored.
We have observed a significant number of changes. Previously, the question was whether companies were making enough changes; over time, the regulatory questions will shift to whether the solutions are effective. Regulators will need to be equipped to handle these complex questions.
The research also revealed a significant challenge in gathering information about the changes companies have made, highlighting a transparency gap that regulators and companies should address.
The Digital Futures Commission (DFC) is committed to undertaking a further research project, with the next report to be published in early 2026. That report will also draw on evidence from the DSA and OSA’s transparency measures made available over 2024 and 2025.