EU Digital Services Act: How it will make the internet safer for children
The Digital Services Act is likely to set global standards for a safer internet. Image: Unsplash/Annie Spratt
- The EU has agreed on the Digital Services Act, which aims to set a new rulebook for how Europe regulates Big Tech and the digital landscape.
- The legislation includes a number of measures, such as recognition of the rights of the child and a ban on targeted advertising aimed at children.
- While it applies to those living and working in the EU, it is expected to set global standards for a more transparent and safe online environment.
European lawmakers recently reached a landmark political agreement on the Digital Services Act (DSA), an ambitious new rulebook for how Europe regulates its digital landscape, especially Big Tech.
The wide-ranging plan covers how companies – online platforms, search engines, online marketplaces and every other significant provider of digital services – moderate and manage content, including illegal content, hate speech and disinformation, and it has a number of implications for children.
While the legislation applies to those residing in the European Union (EU), much like the EU’s General Data Protection Regulation (GDPR) changed the worldwide landscape for data governance, it is expected that the Digital Services Act will set “new global standards” for a more transparent and safe online environment.
As with GDPR, the DSA applies to non-EU companies if they offer services in the EU. So in short, it’s an end to industry self-regulation there and a big deal for the future of the internet.
How will the Digital Services Act impact children?
While the final text is not yet available (and is still being ironed out), from the initial 2020 proposal – UNICEF was one of 2,863 organizations that responded to the public consultation – and from subsequent amendments and EU statements, we know the following about what it could mean for children:
- Clear recognition of the rights of the child. The Convention on the Rights of the Child and General Comment No. 25 on children’s rights in relation to the digital environment, which formally sets out how these rights apply to the digital world, are both referenced. A child here is anyone under the age of 18.
- The swift removal of illegal online content, which includes child sexual abuse material, illegal hate speech, terrorist content and illegal products. Victims of online harassment will be better protected against the unlawful non-consensual sharing of private images, with immediate takedowns. UNICEF’s research conducted with ECPAT and INTERPOL in 13 countries indicates that image-based sexual abuse is a prevalent crime against children.
- Risk assessments for impacts on rights, including those of children. Very large online platforms (VLOPs) – those with more than 45 million monthly active users – are held to stricter regulation because “the greater the size, the greater the responsibilities of online platforms”. They must conduct risk assessments at least once a year to gauge any negative effects on privacy, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in key articles of the Charter of Fundamental Rights of the European Union. The reference here relates to children having the right to such protection and care as is necessary for their well-being, and to private institutions always acting in the child's best interests. Audits must also consider how Big Tech’s algorithms affect the physical and mental health of minors.
- A ban on targeted advertising aimed at children and restrictions on data harvesting for profiling. For all users, targeting with online ads based on sexual preference, health information, religion and political beliefs will be disallowed.
- Greater understandability. For services that are primarily directed at minors or predominantly used by them, platforms must explain their terms of use in a way that minors can understand.
What does the Digital Services Act mean for everyone, including children?
General requirements, which would also benefit children, include:
- A ban on "dark patterns" that seek to manipulate users' choices through deceptive or confusing user interfaces (think nudges or giving visual prominence to a particular choice). With the DSA, "cancelling a subscription for a service should become as easy as subscribing to it”.
- Greater algorithmic accountability. The European Commission and its member states will have access to the algorithms of VLOPs. Furthermore, vetted researchers, not-for-profit bodies, organizations or associations will have access to the key data of such platforms to “provide more insight into how online risks evolve”. Ideally this will lead to an increase in age-disaggregated data to improve the knowledge of tech’s impact on children.
- More algorithmic transparency and choice for users. VLOPs will be required to make the workings of their recommender algorithms transparent to users. Furthermore, users will be able to select a recommender system “not based on profiling”. For example, you could see your social or news feed in chronological order, not based on algorithmic ordering.
- More detailed moderation reporting. Companies will be required to “release detailed biannual reports of their moderation efforts, including the number of staff, expertise, languages spoken and the use of artificial intelligence to remove illegal content”.
- Better reporting, dispute mechanisms and compensation for users. Users will have better means to report harmful content, as well as to appeal decisions made about the removal of their own content. Recipients of digital services will have a right to seek redress for any damages or loss suffered due to infringements by platforms.
- Swift action on misinformation. VLOPs will need to introduce new strategies for dealing with misinformation during crises (a provision apparently inspired by the war in Ukraine). This is welcomed given how children may be especially susceptible to mis/disinformation.
- Large penalties. Companies that don’t comply can be fined up to 6% of their worldwide turnover.
How will the Digital Services Act be implemented?
A new role of digital services coordinator will be created within each EU member state to independently regulate the Digital Services Act. However, the specifics of these regulators have yet to be fully developed, creating room for interpretation around how to enforce the legislation. As is often the case with regulations, the devil is in the details.
A key feature of the DSA is that it broadly does not “dictate what legal content digital platforms can and can’t allow online” – rather, it “aims to set procedural standards for dealing with illegal content and ensuring that companies are applying their own terms and conditions fairly and consistently”. What is considered illegal will depend on both EU and national legislation.
What are the next steps in its implementation?
Once formally adopted by the EU co-legislators, the DSA will apply after 15 months, or from January 1, 2024, whichever is later. For VLOPs, the rules will apply sooner.
The Digital Services Act is an important first step towards tech companies taking a greater responsibility for content that appears on their platforms. Whatever the final text, it will have implications for children and those promoting their rights.