How to restore trust in data
Our digital relationships need a trust upgrade. Image: Alexander Sinn/Unsplash
- Data is increasingly prevalent and important in today's digital world
- Yet the threats of data breaches, identity theft and fake news can cause consumers to lose trust
- Creating better data governance involves improving methods of consent
Dear digital reader: Do you trust this website?
What about the site you have open in the next tab—the one from the e-commerce company that holds your credit card information and knows your shopping, reading or travel habits better than you do? Or the social media platform to which you’ve turned over who-knows-how-much personal data?
If you’re like an increasing number of people, the answer is probably, “Not so much.”
Data breaches. Identity theft. Fake news. Censorship and spying. It’s little wonder that surveys show trust in the digital economy is eroding, and legislators from Europe to California are enacting restrictive data-privacy laws. Even when the businesses and other entities that collect our data play by the rules we’ve agreed to… well, what exactly have we agreed to, anyway? Where is our information actually going and how is it being used?
Data is a powerful source of innovation and prosperity, but the systems designed to govern it are not working nearly as well as they should. As we enter the second decade of the 21st century, one of the most important questions for the world is: How do we restore trust in the digital economy?
Finding the right answer will be crucial.
Data is so important to modern life that it’s become a commonplace to call it “the new oil.” Unlike oil, however, data isn’t finite. It can be copied and shared, and exploiting it doesn’t use it up. Exchanging and combining data—not “consuming” it—is how we create value. But when trust in the laws and technology that govern data breaks down, barriers to sharing go up. And that means less value, for people, businesses and society at large.
So how to restore that trust?
Like the traditional economy, the digital economy consists of a web of relationships—between individuals and businesses, businesses and governments, governments and citizens, and the governments of different countries. Each relationship is a vector of trust, with its own characteristics and its own requirements for keeping it functioning and healthy.
Take the relationship between individuals and the businesses that collect, use and profit from their data.
The conventional way to ensure trust between these parties is consent—the agreements we enter that allow service providers to collect particular kinds of data and use it in a particular way. But as anyone who’s clicked “agree” on an online form without reading it knows, consent is an imperfect tool. Media exposés of data misuse show how much data-collectors can get away with without technically violating consent.
And even if you know what you’re agreeing to, will you still feel the same tomorrow, or in 10 years? What about people with diminished capacity to make consent decisions, such as elderly people suffering from dementia? How do we include them in the digital economy without compromising their already precarious autonomy?
Enhancing trust requires rethinking consent. Luckily, technical tools are emerging that could allow for a more flexible, tailor-made approach. Data exchanges, regulated by governments and secured by technologies such as advanced encryption and blockchain, could allow people to decide what kinds of personal data they want to share, with whom, and under what circumstances—and to reap rewards for doing so. A cancer patient, for instance, might allow her medical records to be shared freely with academic researchers but require stricter privacy protections or financial compensation before the data could be made available to a for-profit drug company. Fair, broadly applicable default settings could counter “consent fatigue” and keep people from being overwhelmed.
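To make the idea concrete, here is a minimal, hypothetical sketch in Python of how such a tailor-made consent policy might be represented inside a data exchange. The recipient categories, rule fields and helper function are illustrative assumptions for the cancer-patient example above, not a description of any existing system or standard.

```python
from dataclasses import dataclass
from enum import Enum, auto


class RecipientType(Enum):
    ACADEMIC_RESEARCH = auto()
    FOR_PROFIT = auto()
    GOVERNMENT = auto()


@dataclass
class ConsentRule:
    allowed: bool                 # may this recipient type receive the data at all?
    requires_compensation: bool   # must the data subject be compensated first?
    anonymized_only: bool         # must identifiers be stripped before sharing?


# Hypothetical per-recipient policy for one patient's medical records:
# free for academic research, paid-only for drug companies, closed to others.
medical_records_policy = {
    RecipientType.ACADEMIC_RESEARCH: ConsentRule(True, False, True),
    RecipientType.FOR_PROFIT: ConsentRule(True, True, True),
    RecipientType.GOVERNMENT: ConsentRule(False, False, True),
}


def may_share(policy: dict, recipient: RecipientType, compensation_offered: bool) -> bool:
    """Return True only if the request satisfies the data subject's consent rule."""
    rule = policy.get(recipient)
    if rule is None or not rule.allowed:
        return False
    if rule.requires_compensation and not compensation_offered:
        return False
    return True


# A drug company's request is refused unless compensation is offered;
# an academic request under the same policy goes through.
print(may_share(medical_records_policy, RecipientType.FOR_PROFIT, compensation_offered=False))       # False
print(may_share(medical_records_policy, RecipientType.ACADEMIC_RESEARCH, compensation_offered=False))  # True
```

A real exchange would of course need far more than this—auditing, revocation, and enforcement—but even a simple per-recipient rule set illustrates how consent could become granular and revisable rather than a one-time “agree” click.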
Other digital relationships need a trust upgrade, too.
Your government’s right to access and share private data depends on the circumstances, and as more aspects of our lives are turned into digital information, we need a new consensus about when exceptions to normal privacy rules should apply. Even the most stringent privacy regulations, like the European Union’s General Data Protection Regulation (GDPR), contain exceptions for things like public-health emergencies or criminal investigations. Trustworthy, human-centric data governance must find a balance between the rights of individuals, the interests of data-exploiting businesses and the welfare of society as a whole.
Unfortunately, no current data regime gets the balance right. In the lightly regulated United States, the digital economy is led by a few massive big-tech businesses. Europe has countered with strict privacy regulations and assertive antitrust enforcement, but to a degree that risks sacrificing openness and innovation. And other countries are pursuing walled-off, state-controlled models that are a different beast entirely.
This regulatory fragmentation among national and regional authorities is a problem in itself—and another manifestation of weakening trust.
It is not simply a matter of different and sometimes incompatible rules. More and more countries are imposing “localization” requirements that directly restrict the sharing of data across borders. Their reasons are varied and include concerns about privacy and national security as well as (unstated) protectionism. Keeping data walled off is seen by some as a way of nurturing local digital industries. Foreign competitors, and local consumers, lose out as a result.
International coordination is vital to addressing these issues.
The need to address data governance on a global scale was recognized by leaders of the Group of 20 nations at their 2019 summit in Osaka, Japan. Japanese Prime Minister Shinzo Abe declared that he wanted the meeting to be “remembered as the summit that started world-wide data governance.” One product of the G20 discussions was the so-called Osaka Track, a new overarching framework for promoting cross-border data flows in a range of global forums, including the World Trade Organization. Its guiding principle is the idea of Data Free Flow with Trust (DFFT).
The idea recognizes that trust and openness are two sides of the same coin. As Prime Minister Abe explained at the Annual Meeting of the World Economic Forum in Davos-Klosters, Switzerland, in January 2019, it is only by bolstering trust—in privacy, cybersecurity and adherence to broadly compatible, rules-based systems—that we can ensure the free flow of data that is crucial to creating new value for the world in the Fourth Industrial Revolution.
This article was first published on La Stampa.