Why the tech giants of Silicon Valley must rebuild trust after explosive beginnings
YouTube has angered content creators with opaque rules on harmful content. Image: REUTERS/Kim Hong-Ji
For the past few decades, we have been in a state of collective awe at the pace at which technology has transformed our lives, happily consuming innovation after innovation with no questions asked, pining for the next phone, the next upgrade, the next interface, the next platform.
However, events over the past several years have led more and more people to start asking questions. From the impact of fake news on election results, to the implications of automation for jobs, to the rise of internet addiction and mental health issues, to the proliferation of cybersecurity vulnerabilities, a number of unintended consequences are emerging in the wake of global tech transformation.
As a result, the tech community is facing a chorus of disapproval about the impact its products and services are having on people’s day-to-day lives and society at large. In other words, Silicon Valley’s “social licence to operate” is in question.
What is a social licence to operate?
A social licence to operate (SLO) generally refers to the ongoing acceptance and approval of a commercial entity by local community members and other stakeholders that can affect its profitability. Achieving societal acceptance and approval is not a new challenge for the private sector. Many traditional industries have struggled with maintaining their SLO for decades.
Take the mining industry. When natural resource extraction became ubiquitous, global economic development accelerated and the industry created new opportunities for nations and communities all over the world.
At the same time, an array of unintended environmental and social impacts ensued, resulting in pressure from various stakeholder groups for the industry to secure and maintain its SLO. As a result, the mining industry has been through an ongoing, long-term transformation in the way it relates to, and partners with, stakeholders.
Researchers from the mining industry have investigated this area and determined that an SLO is equivalent to trust, which is achieved through four main avenues: the actual impacts of the company on its stakeholders; the quantity of engagement between company and stakeholders; the quality of that engagement; and the perceived fairness of company policies and activities.
Research on SLO shows that the single biggest success factor in establishing trust, and ultimately maintaining a social licence, is procedural fairness: the perception that the way a situation or person was handled was fair.
The second most important factor is contact quality, or the extent to which stakeholders leave an interaction with the company feeling that it has genuine intentions to engage with them.
While the high-tech and mining industries are drastically different in many ways, it’s possible that the paths to securing and maintaining a social licence are similar.
Procedural fairness in platform governance
Maintaining an SLO is not only about ensuring big governance decisions align with corporate values and societal expectations; it is about a commitment to ensuring that the process by which key decisions are made is both actually fair and perceived as fair.
Take the so-called demonetization of YouTube channels, which removed advertising from channels featuring “inappropriate content”, for example. Over the past several years, YouTube has struggled to address growing concerns that the platform enables harmful content, such as hate speech, anti-Semitism, and extremism.
It also attracted scrutiny from key advertisers who did not want their brands associated with such content. YouTube's leadership made a values-based and commercial decision to demonetize YouTubers who do not comply with company content policy in those areas, which in itself is not a bad thing.
The risk to YouTube’s SLO stemmed from how the platform decided to demonetize certain content; the process by which content is moderated and deemed inappropriate; and the communication between YouTube and its users.
Many platforms use a combination of algorithms and human moderation to address inappropriate content, but in practice non-compliant content is often flagged and demonetized automatically, without any nuanced due diligence or discussion with the content creator.
Sometimes this results in very positive and safe content being demonetized, while other, genuinely questionable content remains untouched.
One YouTuber said in a Guardian article: “YouTube’s policy is just very vague, which makes sense because I think demonetization needs to be handled on a case-by-case basis. Their policies seem more reasonable when you ask a human to check it, but the algorithm that catches videos originally is really unfair.”
He added: “I can’t trust YouTube anymore”. He wants Google, which owns YouTube, to be more open about how exactly it moderates content: “I want them to be transparent about what they think to be advertiser friendly.”
By investing in an ongoing dialogue directly with user communities and other stakeholders, including advertisers, about these issues and potential changes to content policy, YouTube could co-create a path forward with everyone involved. Stakeholders would have a place to go to air their concerns, talk through issues and troubleshoot problems, without going through the lengthy (and in some cases inaccessible) process of submitting a formal appeal.
Focusing on how to create a fair process for content moderation and policy decisions would enable YouTube to live up to its corporate values while building and maintaining trust with its key stakeholder groups.
Quality stakeholder engagement
If quality engagement is a core contributor to building trust and, ultimately, a social licence, then, in the context of Silicon Valley, the interaction between a company, its users and other stakeholders needs to reinforce trust and the sense that the company has genuine intentions to engage with its users.
Consider the standard terms and conditions screen that appears to confirm user consent. These are often written in verbose, inaccessible legal prose, which generates no goodwill or sense that the company truly wants users to understand the terms they are agreeing to when using its products. This is low-quality contact between a company and a stakeholder.
In the short term, we all might click through these screens so we can quickly get to our urgent content, posts and likes, and this constitutes a legally sound agreement between supplier and consumer.
And yet, over time, as we have already seen, these low-quality interactions erode trust and ultimately leave nothing to fall back on when something goes wrong, whether it is a data security breach, a misuse of personal data, the development of an internet addiction, the experience of cyberbullying, bias in algorithms, or an emerging threat no one has yet seen.
That’s why, even when a company solves one of these problems, the stain on its reputation remains and the hits it takes to its SLO keep accruing. Without trust as a foundation, no corrective actions after the initial error will matter.
So where do we go from here?
The good news is that the industry is already starting to tune into the unforeseen social risks and challenges that come with rapid innovation, and is increasingly taking steps to prioritize its SLO.
For example, Apple CEO Tim Cook has become increasingly vocal about the negative societal impacts of the industry and has repeatedly reaffirmed his focus on building and maintaining trust with all stakeholders, most of all users.
The next step is shifting from being reactive as issues emerge, to proactively ensuring stakeholders are heard and involved in key processes that affect them, building trust-based relationships and, where possible, finding solutions to the potential negative impacts in collaboration with stakeholders. This means applying an SLO lens across the organization from business model decisions, to corporate policy, to new product design, to purchasing and hiring decisions.
This may be a challenge to the innovate-at-all-costs (or “move fast and break things”) model that is often given credit for success in Silicon Valley. But consider that the majority of large tech companies today are based on business models where trust is a central feature. Trust is so central to new business models that without a long-term and resilient social licence, Silicon Valley could eventually fail.
Unlike traditional linear business models, which move products from supplier to consumer, platform business models can’t exist without the acceptance and approval of their users, user communities, consumer advocacy groups, and others.
While incumbent platforms are indeed very powerful today, unaddressed tensions between management and users over time can lead to user revolts or even migration to new, more transparent platforms.
The Reddit Revolt of 2015, for example, was largely about underlying tensions between subreddit moderators and Reddit management, tensions which boil down to a lack of procedural fairness and low-quality communication, as highlighted explicitly in a petition launched by users calling for then-CEO Ellen Pao to step down.
Similarly, festering frustration about demonetization among YouTubers is leading some prominent creators to speak publicly about their desire to see new platforms for content creators emerge, which could pose a serious commercial risk to YouTube’s model over time.
Reddit and YouTube are just two examples of the very human dimensions of these increasingly automated platforms, and of the need for incumbents to focus on maintaining their SLO. Because of this, Silicon Valley has every incentive to solve the SLO issue, leveraging the same ingenuity, innovation and hustle that allowed the industry to flourish in the first place.
Instead of making the same mistakes as the industries of the past, Silicon Valley and its global counterparts could create a new path forward, learning from industry 'elders' and applying the same innovative spirit and ingenuity to shaping the role of the private sector in society.