UNESCO wants to develop an ‘internet of trust’. Here are its guidelines
UNESCO's 'internet of trust' guidelines include five key principles to shape thinking and behaviour in the digital world. Image: Unsplash/Ales Nesetril
- UNESCO has released Guidelines for the Governance of Digital Platforms in a bid to protect people’s rights online.
- It includes five principles that aim to prevent issues such as misinformation, disinformation, hate speech and conspiracy theories.
- The World Economic Forum’s Toolkit for Digital Safety Design Interventions and Innovations aims to harmonize universal perceptions of online threats following a fragmentation of efforts to tackle issues from cyberbullying to hate speech.
Three times as many people in the world use social media as own a car.
Yet how many more regulations are there for our roads than for governing the digital platforms we spend so much of our days using?
An estimated 60% of the world – that’s 4.75 billion people – use social media not just to express themselves and connect with others, but to inform themselves about news and global events. That’s one of the leading statistics in UNESCO’s new Guidelines for the Governance of Digital Platforms, which aim to ensure people retain their rights online and are protected from misinformation, disinformation, hate speech and conspiracy theories.
“If we can no longer distinguish fiction from reality, falsehood from truth, the foundations of our societies crumble. Democracy, dialogue and debate – all essential to address major contemporary challenges – become impossible,” UNESCO says.
“Faced with the global nature of these issues, we need to develop consistent responses around the world, and avoid the fragmentation of regulations or approaches that compromise human rights.”
What’s in the UNESCO Guidelines for the Governance of Digital Platforms?
The guidelines aim to create what UNESCO calls an “internet of trust”.
They were produced through a consultation process with 134 countries and cover the responsibilities of states, digital platforms, intergovernmental organizations, the media and everyday people to “respect, protect and fulfil human rights”.
The guidelines include five key principles to shape thinking and behaviour in the digital world.
Principle 1. Platforms conduct due diligence on human rights
Digital platforms need to be able to show what systems or processes they have in place to protect human rights, UNESCO says.
“Platforms should conduct periodic risk assessments to identify and address any actual or potential human rights impacts of their operations,” it adds.
Specifically, UNESCO says these assessments should take place ahead of elections, to ensure voting processes retain their integrity. There is already evidence of social media impacting election outcomes, and the rise of AI has led to fears that fake videos and disinformation spread online could swing voting results.
It also highlights the need to ensure women, girls and journalists have their rights protected. Women are more concerned than men about having their personal data misused online, and about becoming victims of harassment and abuse, according to research by the World Wide Web Foundation. Journalists can also face abuse and even death threats on social media.
Principle 2. Platforms adhere to international human rights standards, including in platform design, content moderation and content curation
The design of digital platforms and their content moderation and curation policies are the focus of this principle. There is a need to ensure “non-discrimination and equal treatment” in these areas, UNESCO says.
Microsoft is among the firms to have already flagged the need for “inclusive design” online that accounts for people’s differing physical, cognitive and emotional needs.
As for content moderation, UNESCO flags a need to address “biases, stereotypes and discriminatory algorithms or content moderation practices that affect women and girls, as well as groups in situations of vulnerability and marginalization, including indigenous communities”.
Effective online content moderation will counter hate speech and make the internet a welcoming place for all. It also means ensuring that wherever algorithms rather than humans are used for moderation, they are free of the biases that can make them racist or sexist.
Preventing discrimination based on sex, ethnic origin, socioeconomic background and other characteristics is enshrined in numerous international conventions, as the World Economic Forum's Toolkit for Digital Safety Design Interventions and Innovations points out. The toolkit's typology aims to "harmonize universal perceptions of online threats" following a fragmentation of efforts to tackle issues from cyberbullying to hate speech.
Principle 3. Platforms are transparent
While large social media platforms have terms of service and community standards, they need to be clear about how well they are sticking to these standards, including through independent evaluation, UNESCO says.
Transparency does not just mean dumping vast amounts of data or legal texts online. It’s about providing information in a manner that allows users to “make informed decisions”, UNESCO adds.
Complaints mechanisms and advertising practices fall under this principle, including how data is collected to support advertising models. The lucrative advertising market on social media should not be pursued in a way that puts people’s privacy and rights at risk.
Principle 4. Platforms make information and tools available for users
The internet operates globally, and digital platforms should take this into account by making their terms of service available in the official primary languages of every country where they operate.
They should also be able to handle and respond to user communications – including complaints – in these languages.
Special consideration also needs to be made when children or people with disabilities are the intended or potential users. This ranges from writing in age-appropriate language to adjusting websites to make information accessible for deaf or partially sighted people, for example.
Principle 5. Platforms are accountable to relevant stakeholders
Rules are of no use unless they are followed, and ensuring this happens is the focus of UNESCO’s fifth principle.
Digital platforms need reporting mechanisms to log policy violations, but given the global nature of their operations, they also need to ensure these mechanisms take into account local cultural sensitivities in the way complaints are handled.
There also needs to be a special focus on quickly tackling any content that threatens other users – particularly their human rights, or that targets them because of their gender. Special reporting mechanisms are also needed to protect children, UNESCO says.
License and Republishing
World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.
The views expressed in this article are those of the author alone and not the World Economic Forum.