Why data is key to protecting kids online and ensuring the digital future we deserve
- One of the most critical threats to children is the alarming growth in online child sexual exploitation and abuse.
- The key question is how to balance the trade-offs between detecting and acting on harmful content and protecting privacy when the two come into conflict.
- Data can help address this tension by driving informed decision-making that takes priority over emotional or ideological approaches.
Today, children’s lives are an interconnected web of in-person and digital experiences that blend and merge seamlessly. Digital technologies have opened up new worlds for children that we could hardly have imagined when we were growing up, but they have also exposed them to threats and dangers far beyond the traditional forms of violence in childhood. These include online harassment and violence, exposure to harmful online content – such as violent, sexual and self-harm material and hate speech – as well as concerns around the unethical use of children’s digital data.
However, one of the most critical threats to children is the alarming growth in online child sexual exploitation and abuse (CSEA).
Rights-respecting approach
Children worldwide are being groomed into sharing intimate images, coerced into performing explicit sexual acts and having images of their sexual abuse shared online – all of which cause devastating harm to children’s lives and to wider society. Disrupting Harm, a first-of-its-kind large-scale, multi-country research project on children’s experiences with digital technology, highlights that between 1% and 20% of children aged 12-17 were subjected to CSEA in 2020 across 13 countries in Eastern and Southern Africa and South East Asia. Scaled to the general population of internet-using children, these estimates represent millions of children in each country. In an alarming new report, the Internet Watch Foundation (IWF) says that sexual abuse imagery of primary school children has risen by 1,000% since 2019.
The recent spate of progressive legislative actions by world leaders is an attempt to respond to the new digital era’s existing and emerging challenges. The proposed EU legislation to prevent and combat online CSEA, the EU Digital Services Act (DSA), the UK Online Safety Bill, the California Age-Appropriate Design Code Act and Australia’s eSafety Commissioner – the world’s first government agency created to keep people safe online – are all efforts to shape the contours of a rapidly changing world. They set high standards to protect children online through policy action and incentivize the technology industry to prioritize child safety.
As laid out in the World Economic Forum white paper, “Global Principles on Digital Safety”, creating a safer digital ecosystem necessitates a rights-respecting approach: one that adheres to international human rights law and balances rights such as safety, privacy and freedom of expression.
It is vital to acknowledge that digital safety requires deliberations that balance legal, policy, ethical, social and technological considerations. Such decisions must be rooted in existing human rights laws and principles, which recognize that fundamental human rights are indivisible.
To create a safer digital ecosystem, regulators must collaborate with other players in the ecosystem, which necessitates multi-stakeholder cooperation. Any action taken should be grounded in necessity, proportionality and legality, and decisions should be made with a comprehensive understanding of the various stakeholders’ interests and concerns.
The fact is that privacy and safety are intrinsically interconnected and, most of the time, mutually reinforcing. In many cases, users’ privacy is a prerequisite for their safety. Some privacy measures, however, can complicate children’s safety. For instance, the polarizing debate on end-to-end encryption (E2EE) of digital communications often ignores the complexity of the situation and the fact that there are legitimate concerns on both sides of the ‘E2EE divide’. Weakening E2EE could endanger the confidentiality and security of communications, yet extending it without safeguards severely impedes efforts to detect and remove online child sexual abuse material and to identify offenders.
Efforts to tackle online child sexual exploitation and abuse, and to ensure safe and age-appropriate experiences, have led to the development of new privacy-preserving technologies such as client-side scanning and various on-device age-estimation tools. But concerns remain about their impact on user privacy, and these need to be acknowledged. The key question is how to balance the trade-offs between efforts to detect and act on harmful and illegal content, and to protect privacy, when there is a conflict.
Data can help us address this tension: it is a fundamental asset for driving informed decision-making, which needs to take priority over emotional or ideological approaches.
Role of data in vital new insights
Firstly, we need reliable data on children’s experiences in the online world as well as surrounding contextual factors to ensure that all parts of society, including governments and industry, can respond to emerging threats together.
Currently, there is limited understanding of how children engage with digital technology in countries across the globe, and little evidence on what forms of online abuse children experience, who is more vulnerable, who the common offenders are, and whether children know how to recognize and report online abuse. Large-scale projects like Disrupting Harm demonstrate the role of data and evidence in generating vital new insights on online CSEA, which are crucial in addressing the issue.
Secondly, we need more data on the prevention and response side, particularly on national child protection systems and how they respond to the rapidly growing scale of the problem, and whether these policies and practices fit the digital age.
Finally, we need data that can help us leverage cutting-edge technologies to overcome polarized positions on privacy and safety. For instance, a decision on whether to implement client-side scanning technologies to identify child sexual abuse material (CSAM) could be made considering objective data such as: i) the benefits in terms of the amount of CSAM identified and removed; ii) potential prosecutions of abusers, hence preventing the recurrence of crimes; iii) the share of false positives; and iv) instances of unwanted use of the technology.
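To illustrate – not prescribe – how such objective criteria could inform a decision, the brief sketch below aggregates some purely hypothetical deployment figures into the four measures listed above. Every number, function name and field in it is invented for illustration; real figures would need to come from transparent, independently audited reporting by the platforms and agencies involved.

```python
# Illustrative only: hypothetical figures showing how the four measures
# listed above could be summarized for a policy discussion.

def summarize_deployment(flagged, confirmed_removed, prosecutions, misuse_incidents):
    """Return the four decision metrics for a hypothetical scanning deployment."""
    false_positives = flagged - confirmed_removed
    false_positive_share = false_positives / flagged if flagged else 0.0
    return {
        "csam_items_identified_and_removed": confirmed_removed,  # metric i
        "prosecutions_enabled": prosecutions,                    # metric ii
        "false_positive_share": round(false_positive_share, 4),  # metric iii
        "reported_misuse_incidents": misuse_incidents,           # metric iv
    }

# Hypothetical numbers, purely for illustration.
print(summarize_deployment(flagged=10_000, confirmed_removed=9_200,
                           prosecutions=35, misuse_incidents=2))
```

The point of such a summary is not the code itself but that each side of the debate would be arguing over the same measurable quantities rather than over abstractions.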
Unfortunately, you will rarely hear these metrics used in the debates around privacy and safety, which makes progress in this area even more difficult.
Together with partners, the Safe Online Initiative at End Violence is set to address this issue. The recently held Safe Digital Futures: Data for Digital Rights event at the European Parliament highlighted that the lack of systematic data collection and infrastructure for analysis of online risks and harms for children remains a critical barrier to securing a safe internet for children. Solid data and multi-stakeholder public-private platforms like the Forum’s Global Coalition for Digital Safety are key to driving forward the online child safety agenda.
The world watches in anticipation as upcoming innovations on the technological horizon pave the way for transformative changes in the online world. The challenges in the future may be more complex than simply chatbots professing their love for humans and may impact children in ways that we never imagined. New technology is exciting, but we cannot afford to play catch-up yet again.
Regulators’ policymaking – as well as company decisions about product features, content and conduct moderation, and other areas – requires a deep understanding of the implications for all stakeholders, especially children, supported by data. And the only way for that to happen is through public-private collaboration with an open mind.