
Would you give up your friend's privacy for pizza?


A new study has shown that privacy takes a backseat for a reward as simple as free pizza. Image: REUTERS/Mark Makela

Stanford University

Although many people say they want to protect their personal information, privacy tends to take a backseat to convenience and can easily get tossed out the window for a reward as simple as free pizza, a new study shows.

The research provides real-life evidence of a digital privacy paradox: a disconnect between stated privacy preferences and actual privacy choices. And it serves up some food for thought for policymakers about how to regulate data sharing without creating more hassles for consumers.


“Generally, people don’t seem to be willing to take expensive actions or even very small actions to preserve their privacy,” says Susan Athey, coauthor of the paper and a senior fellow at the Stanford University Institute for Economic Policy Research. “Even though, if you ask them, they express frustration, unhappiness, or dislike of losing their privacy, they tend not to make choices that correspond to those preferences.”

In highlighting the distortions in consumer behavior regarding privacy, the findings suggest that safeguards such as the widespread “notice and choice” approach, in which companies disclose their data practices and users consent to them, are not enough.

Athey and her coauthors seized a unique opportunity to empirically explore the privacy paradox when the Massachusetts Institute of Technology launched a project in 2014 to encourage experimentation with Bitcoin among its undergraduates.


The researchers examined how 3,108 undergraduates acted on their privacy preferences while choosing an online wallet to store and manage the digital currency. Alongside the Bitcoin distribution, the researchers also measured the students’ stated privacy preferences.

Regardless of varying levels of privacy features, the order of the four wallet options presented upon sign-up seemed to drive many of the participants’ decisions, even when the choice contrasted with their stated privacy preferences, the study found.

And it made little difference when researchers provided students with more details of each wallet’s privacy features; the influential effect of the ranking order persisted.

What’s more, students who had expressed stronger preferences for privacy—whether it was privacy from the government, the commercial provider, or the public—essentially behaved no differently than those who said privacy was less of a concern, the study found.

Pizza, privacy, and policy

To see whether a small incentive could influence a decision about privacy, researchers offered one group of students a free pizza—as long as they disclosed three friends’ email addresses.

An overwhelming majority of the students chose pizza over protecting their friends’ privacy. Neither gender nor stated personal sensitivity to privacy seemed to have any effect on the choice.

People “are willing to relinquish private data quite easily when incentivized to do so,” the authors of the study write plainly.

Researchers also gave students an option to add additional encryption to help secure information in setting up their wallets. Though the encryption would not have added a security benefit to future wallet transactions, the offer was meant to test whether the participants were willing to take extra steps to protect their privacy.

About half of the students initially opted to go through the extra step of adding the feature. Yet only half of that group completed the process; the rest reverted to the simpler setup without the encryption.

Altogether, the experiment results show that “consumers deviate from their own stated preferences regarding privacy in the presence of small incentives, frictions, and irrelevant information,” the authors write.

The findings provide a rare snapshot: The privacy paradox has been widely observed, but empirical evidence from a real-world setting—involving choices with real consequences—has been limited.

The study points to two policy implications.

Since the findings show consumers’ actions don’t align with what they say, and it’s difficult to gauge a consumer’s true privacy preference, policymakers might question the value of stated preferences.

On the other hand, consumers might need more extensive privacy protections to guard them against their own willingness to share data in exchange for relatively small incentives.

In any case, as people are quick to give up some privacy for less hassle, regulations should avoid inadvertently sticking consumers with additional effort or a less smooth experience as they make privacy-protective choices, the study states.

What we say vs. what we want

“The big issue is that consumers say they want privacy, but if, for example, a firm introduced better privacy policies, would they actually get more customers? My observation is that, generally, the answer is no,” says Athey, who is a professor of the economics of technology at the Stanford Graduate School of Business and has consulted for Microsoft Corp. since 2007.

The traditional economic paradigm assumes that users have full information and make informed choices. But that dynamic doesn’t hold if consumers don’t take the time to really evaluate all of their options, she says.

“Then the market provides weaker incentives for firms to really give consumers what they want,” Athey says.

Consumer laziness may play a role, but Athey says she also thinks consumers don’t feel they have “meaningful choices” when it comes to how service providers—ranging from social media and email to banking and retail—handle personal data.

For social media, users will gravitate to where their friends are, regardless of privacy policies, Athey explains. At the same time, major email programs all have fairly similar privacy policies, so it’s tough to differentiate them or to understand how much switching to a new provider would actually improve the situation.

And no matter what businesses do with consumer privacy settings, or even if they blunder and anger users by disclosing or losing too much personal information, it appears that consumers will usually stick with them.

Numbness kicks in, too. Having consumers repeatedly consent to legal privacy terms or confirm their acknowledgement of cookies just trains users to ignore them. In turn, such privacy notices probably have zero impact.

“And I don’t see firms offering consumers really great choices about how long they will retain your data,” Athey says. What would happen, say, if consumers could choose among providers based on how long their data is stored: 10 years, two years, one year, or six months?

“We don’t have those kinds of meaningful choices, and the policies we have don’t provide firms any incentive to offer those meaningful choices,” she says.

Understanding consumer behavior

The study’s findings about the power of placement and ease of navigation are consistent with consumer behavior that tech firms already know well.

“By and large, when you’re on a small screen, the information that is presented most conveniently is the information that you pay attention to,” Athey says.

That’s why it matters how Facebook ranks its news stories, which apps sit at the top of a mobile app store, and which link appears first in search results.

“All of these technology intermediaries have a huge impact on what you read, what you consume, and what you buy just by how they present information to you,” Athey says.

In light of what we know about consumer behavior, “the way privacy policies both here and in Europe have been designed is pretty ineffective,” she says.

“There’s a role for regulation here, clearly, in this area of privacy and security,” Athey says, “but even beyond telling companies what to do, just making it simpler for consumers to make meaningful choices.”

It could be easier on consumers, for example, if privacy regulations called for some kind of report card, based on expert audits, that evaluates tech firms’ practices.

“Then you as a consumer can say, ‘I really like this product, so I’m willing to take a B on this privacy policy,'” Athey says.
