
The data debate: Rights of the individual vs institution

Big data, privacy, consent, security, governance

Data is valuable, raising questions about its ownership. Image: UNSPLASH/ev

Antonio Zappulla
CEO, Thomson Reuters Foundation
This article is part of: World Economic Forum Annual Meeting
  • Digital progress has not necessarily resulted in digital equity.
  • Future data governance needs to move beyond questions of ownership and control.
  • It needs to address the power equilibrium between the individual and the institution.

Data has become the world’s most valuable currency, and it is being produced at an unprecedented scale and pace. In a single day, roughly 500 million tweets are posted, around 294 billion emails are sent and five billion searches are made. Every day, we produce some 2.5 quintillion bytes of information.

The power of data is immense, but the consequences of willingly or unknowingly giving it away – for our privacy, our security and society in general – are presenting a raft of challenges to our fundamental human rights that would have been unthinkable a decade ago.


Issues surrounding data and its impact on human rights are still new to public discourse. Some of the more searching questions are not immediately evident. With this in mind, here are four critical questions we need to answer.

Who really “owns” our data?

Following the backlash over the Cambridge Analytica scandal, the question of data ownership has become familiar. The tide of public concern over the use, misuse and mismanagement of personal data has led to regulations such as the GDPR in the EU.

But data ownership goes beyond the binary relationship between “you” and “your data”. Unlike other property, data can be owned by multiple people simultaneously: you can give it away and still have it yourself. Moreover, it might be about you, but it can be about other people too. The presents you’ve just bought for friends and family online mean you’ve now shared information about who they are, what they like and where they live.

Then there is the complicated issue of how this data is generated. The link between you and the data you actively or passively supply is clear. But in the future, could machine-generated data about you be considered personal data? And who would own that?

Finally, what happens when your data is part of a data set – gathered, say, by an institution like the NHS? Does the data belong to you, to the group, or to those who collect it and might add to its value?

Perhaps the only definitive answer is that the question of data ownership may not be the right one to ask.

Does consent mean control?

You’ve consented to the use of cookies or to sharing information on Facebook, reassured by the terms and conditions that you own all of the content and information you post. But controlling when you share personal information doesn’t necessarily mean controlling how it is then used.

Your data – combined with other data – is used to inform algorithms, which then make assumptions about you – whether it be your salary, health or suitability for a job. Power has shifted from the consumer to the company.

Take the example of Chinese company Ping An, the world’s biggest insurer, which enrols customers through facial recognition. The software it uses can calculate the percentage of a person’s body fat, which determines their premium. This may be helpful to the business, but what does it mean for the consumer?


There is no doubt that assumptions about us made by algorithms can be useful, but they might be inaccurate or unfairly weighted. Take, for example, what is happening in the US, where in certain jurisdictions risk-assessment algorithms are being used not only to predict where violent crime might occur, but whether offenders are likely to re-offend – a score judges use to determine sentencing. Here, individuals are being judged not only on their own data, but on the basis of their similarity to others.

This is linked to the issue of transparency in how algorithms are optimized. For example, US regulators are currently investigating whether Apple’s new credit card is biased against women, after a complaint that one user was offered a spending limit 20 times higher than his wife’s, even though she had the better credit score. When this was questioned, the response was to blame the algorithm without explaining the discrepancy. “Consent” here is meaningless, and certainly doesn’t lead to “control”.

What about our privacy rights?

If the concept of “consent” is flawed, then so is the argument about privacy. You may have no desire to share pictures of a party you went to last night, but what happens when other partygoers share their photos on social media and you are in them? Your decision to opt out does not stop data about you being shared.

The only real control you could exert would be to keep your data entirely private. But opting out can mean missing out, and in practice it is virtually impossible. Besides, your lack of data won’t stop assumptions being made about you by businesses, governments and institutions that collect vast amounts of data about other people.

We cannot simply lift the right to privacy from the existing human rights framework and apply it to the digital domain. We are as affected by data about other people as we are by our own. So perhaps the concept of “privacy” is also fast becoming outdated.

Data: the price of personal information. Image: Statista

Can we be compensated for our data?

How can we ensure a more equitable outcome from today’s data economy, in which individuals are not benefiting from the exchange of their own data? This becomes even more complex when we recognize that individual data is not as valuable as collective data. Health data sets, for example, hold huge social and economic value. But how are the people who created them being compensated?

There is an argument that data should be treated as labour, with individuals actively and transparently handing over their data and being rewarded for their contributions collectively through data “unions” or “trusts”. Not only would this ensure better-quality data is used to inform digital systems, it would also create a fairer economic system, narrowing the huge divide between the big technology companies and individual data suppliers.

Increasingly there are calls for regulation too, such as creating a Bill of Data Rights, which would protect against, for example, data being used for unreasonable surveillance or unfair discrimination.

Overall, digital progress has not necessarily resulted in digital equity. While there is no doubt that the technological revolution has empowered civil society with free communication, connection and information, it has also become a means of misinformation, surveillance and control.

Future data governance needs to move beyond questions of ownership and control. It needs to recognize the enormous collective benefits of data collection to society, to distinguish between individual data and large data sets, and to balance the interests of the individual and society. Most of all, it needs to address the power equilibrium between the individual and the institution.


