200 women call on tech giants to prioritize online safety. Here's how
Addressing safety for women online. Image: Victoria Heath/Unsplash
- More than 200 women sign a letter calling for tech companies to "prioritize the safety of women."
- At the Generation Equality Forum in Paris, Facebook, Google, TikTok and Twitter released commitments to improve the safety of women online.
- A new report from the World Economic Forum offers insights into how to advance digital safety.
More than 200 prominent women signed an open letter calling for big tech companies to “prioritize the safety of women.”
Published on the World Wide Web Foundation's website on Thursday, the letter coincided with the Generation Equality Forum in Paris, which brought together governments, businesses, international organizations and civil society to make progress towards gender equality. During the forum, Facebook, Google, TikTok and Twitter released commitments to improve the safety of women online, including providing easier navigation and access to safety tools and reducing the amount of abuse women see.
The letter emphasized the need to give users greater control over their safety online, including who can interact with them and more choice over what, how and when they see content. It also called for improved systems for reporting abuse.
The signatories include Julia Gillard, the former prime minister of Australia; Kathryn D. Sullivan, a US astronaut; Graça Machel, Nelson Mandela’s widow; and actors Ashley Judd, Thandiwe Newton and Emma Watson.
"The internet is the town square of the 21st century," the signatories wrote. "It is where debate takes place, communities are built, products are sold and reputations are made. But the scale of online abuse means that, for too many women, these digital town squares are unsafe. This is a threat to progress on gender equality."
Why do we need to address safety for women online?
The letter cited a 2020 study by The Economist that found "38% of women globally have directly experienced online abuse." According to the study, 85% of women had witnessed online violence against other women, and younger women were more likely to report experiencing abuse themselves.
The forms of online abuse vary, ranging from cyber harassment and hate speech to impersonation, misinformation and defamation, according to the study.
The COVID-19 pandemic has increased the potential for abuse, as people have spent more time online during lockdowns.
The World Economic Forum's "Advancing Digital Safety: A Framework to Align Global Action" report surveyed 50 experts from academia, civil society, government and business and found that 75% agree or strongly agree that platforms do not adequately act on harmful content. The challenge is greater for content where the definition of harm is less clear, with 90% saying such content is handled somewhat or highly ineffectively.
What can we do to increase online safety?
According to the Forum report, employers, advertisers, regulators and digital platforms all have important roles to play in ensuring a safe online environment. Employers can work to secure devices and train employees, advertisers can ensure safety guidelines and take more nuanced approaches to keyword usage and ad placement, and regulators can step up their efforts around monitoring and response.
Digital platforms can enable users to make more informed choices about content, increase cross-platform collaboration to counter online harassment, make safety a leadership priority, and address contracting and work environment issues with the content moderation workforce.
“When it comes to harmful content, today the metrics reported on by platforms focus largely on the absolute number of pieces of content removed, which do not provide an adequate measure of safety according to a user’s experience," says Cathy Li, Head of Media, Entertainment and Sport Industries at the World Economic Forum.
"And it certainly does not reflect the important nuance that certain groups – based on their gender, race, ethnicity and other factors, in this case women particularly – may be more exposed to harmful content," she continues. "This is why we recommend a user‑centric framework which advocates complementing current measures with metrics that focus on the user experience of safety and work to improve it over time to advance digital safety.”
The Forum's new Global Coalition for Digital Safety will bring together public and private leaders to share best practices for online safety regulation, take action to reduce the risk of harmful content and collaborate on programs to increase digital media literacy.
As the letter's signatories wrote: "Imagine what you can achieve if you follow through on commitments to build safer platforms: an online world where a journalist can engage with feedback on her reporting, not assassinations of her character. Where a politician may read complaints about her policies, but not threats of rape and murder. Where a young woman can share what she wants to on her terms, knowing there are systems to keep her safe and hold harassers accountable."