
6 ways tech can earn our trust

Virtuous cycles ... to succeed, the Fourth Industrial Revolution needs us to believe in it.

Hilary Sutcliffe
Director, SocietyInside

“The slightly plaintive question ‘How can we restore trust?’ is on everyone’s lips. The answer is pretty obvious. First: be trustworthy. Second: provide others with good evidence that you are trustworthy.”

This simple advice from moral philosopher Baroness Onora O’Neill, the 2017 winner of the $1 million Berggruen Prize, cuts to the heart of trust – what matters is what you do, not (just) what you say.

The issue of trust is central to the success of the Fourth Industrial Revolution, but proposed solutions for building that trust most often focus on better communication about benefits, or on improving education so that the ignorant masses see the light and understand that their worries are unfounded. A conversation about the trustworthiness of the technologies underpinning the Fourth Industrial Revolution is well overdue.

But what does trustworthiness look like? Here are six key components to consider when developing the technologies that underpin the Fourth Industrial Revolution.

The wider impact matters

Except in Hollywood films and state defence departments, people rarely set out to invent a technology that deliberately harms people or the environment. Many of the things now widely seen as “irresponsible innovations” (a straw poll suggested DDT, CFCs, asbestos, sub-prime mortgages, opioids and palm oil, among others) started life as someone’s great idea to solve a problem and make some money. They came to be seen as irresponsible only when negative impacts or irresponsible business practices came to light.

Palm oil is often viewed as an irresponsible innovation because of its environmental impact. Image: Antara Foto Agency/REUTERS

Some harms emerge only once the technology is in use; others are knowable in advance but ignored, because money for one group trumps harm to others. Some could have been designed out if thought had been given at the development stage; others stem from inequalities and systemic issues that need coordinated efforts to overcome.

Those leading the development of trustworthy technology have thought through these issues as far as possible in advance. They have mitigated negative effects where possible and, where not, taken steps to compensate the losers or to bring together trusted groups to consider solutions.

Lead with purpose and benefit

A lack of agreement on what constitutes a benefit, and whether it is worth the potential or actual risk or harm, lies behind many concerns about technologies and their applications. This was one of the fundamental issues behind the resistance to the genetic modification of plants. Many disruptive innovations create winners and losers, and it is not unreasonable for the losers to protest vociferously. Sometimes the benefits and risks are not shared equally; sometimes there are no real benefits to society at all, with shareholders reaping the reward and other stakeholders shouldering the risk.

Trustworthy companies put social purpose first and carefully consider trade-offs, harms and risks to all stakeholders, including the planet, both in the vision for their innovations and in their handling of risk. This appears to be paying off: organisations such as Unilever, with its Sustainable Living Plan, can engage in constructive conversations about using new technologies, where less trusted companies are met with hostility.

Be open – show your workings

If you have a child doing maths at school, you’ve probably had the talk about the importance of showing your workings. “It’s not enough just to come up with the answer,” you say, “you need to demonstrate how you came to your conclusions.”

It’s the same for building trustworthiness. It is no longer enough to say: “Our product is great and it’s safe, just trust us.” The erosion of trust in business, policy and science means there is a greater need to demonstrate trustworthiness: through transparency about your processes, and through evidence that your product or technology will do what you say it will, that it is beneficial, and that you have thought through the risks.

Doing that effectively is not easy, and it is often not just the job of business but of policy and academic science as well. Some, however, are taking innovative steps: being more open about their approach to a technology, such as the BASF Dialogue Forum Nano, and participating in multi-stakeholder standards and regulation development, such as the IEEE Global Initiative for Ethical Considerations in Artificial Intelligence and Autonomous Systems. Others are demonstrating trustworthiness through advanced social reporting, entering equal partnerships with trusted intermediaries, and being more open about product safety and testing, such as GSK’s Clinical Study Register, which discloses both positive and negative findings of all its research and encourages data sharing with academia and competitors.

Involve people

The lone genius model of innovation has been debunked, and it is increasingly recognised that innovation works best when we develop it together, with a clear purpose. Working with like-minded academics or patient groups is one thing, but involving NGOs and the general public is often seen as a step too far. Henry Ford’s alleged quotation – “If I had asked people what they wanted, they would have said faster horses” – and Steve Jobs’ secretive but successful approach are used to ridicule any such suggestion.

But that was then and this is now. Some innovations are evolving so fast that only a commitment to really listening to others’ broader points of view can allow organisations to mitigate negative impacts in advance and prepare them for the way their technologies will be received.

Nobel prize-winning economist Daniel Kahneman urges us, in response to criticism, “not to persuade, but to assess the source of resistance and address that”. How will you know what the potential sources of resistance are, let alone how to address them, without getting out there and asking?

Welcome warnings

“We didn’t know people were going to do that with our technology” is a defence used by social media companies in relation to their shifting responsibilities. The history of innovation shows it is indeed very difficult to predict where a technology will lead and what applications will arise from its use. However, spotting early warnings of negative impacts when they arise does not seem to be too difficult – responding in a timely fashion seems to pose a greater challenge.

Human beings have a strong inclination to ignore early warnings of disaster if they clash with our world view or economic incentives, as the European Environment Agency’s Late Lessons from Early Warnings reports powerfully remind us.

Trustworthy companies know this, and they welcome early warnings. They seek them out; they listen closely to dissenting voices; they reward those who bring problems to their attention; they have processes in place to anticipate them; and, where something does go wrong, they resolve it swiftly and responsibly, learn the lessons and refine their processes in response.

It is not unreasonable not to know where innovation may take you; it is unreasonable not to respond promptly and decisively when harms are suspected or, in too many cases, widely known.

Embrace responsibility

Trusted and trustworthy companies embrace their responsibilities. They see this open and honest approach as a source of strength and reputational capital. They don’t try to fudge their responsibilities, wriggle out of them through legal or PR means, or subvert their own processes to avoid them. Time and again we see that it is not a specific problem that damages a company’s or an individual’s reputation, but the response when something goes wrong. Irresponsible actors prevaricate; they smooth over, ignore or deny problems, or sometimes subvert justice and decency to evade accountability. Trustworthy companies have taken steps to understand their often shifting responsibilities and, if a problem arises, deal with it honestly and openly, with authentic communication and appropriate solutions.

Trustworthiness, as Baroness O’Neill says, is built on evidence that you are trustworthy. Now more than ever we need the technologies and businesses shaping our world to demonstrate they deserve our trust.
