The rest of the world can't free ride on GDPR
Countries outside the EU must implement their own data protection laws rather than trust corporations to transfer policy changes driven by the GDPR.
Senior Associate Dean, The Fletcher School of Law and Diplomacy, Tufts UniversityThe digital industry is riding an important—and turbulent—wave of change right now. As Facebook and others grapple with tough questions about data privacy and security practices, trust in social platforms appears to be plummeting. Companies and analysts are scrambling to figure out how to make privacy rules clear, protect user data, and evolve the business models that made them successful in the first place.
A rising chorus of voices suggests that there is a ready-made solution to these pressures around data—and it has already been prepared by regulators in Europe. The EU’s upcoming General Data Protection Regulation (GDPR) will put in place the world’s most demanding set of rules on how user data can be collected and used. Many expect that when these regulations take effect in May 2018, they will address consumers’ concerns.
GDPR could become a model for the rest of the world, the argument goes, since many global companies serve users in the EU. Firms will have to adapt to these regulations anyway, and it could make business sense for them to implement these digital privacy principles worldwide. Facebook itself has hinted that it might extend some of the EU-driven protections worldwide; others may follow its lead. Either way, it’s tempting to think that the biggest challenges to privacy in the digital age would be addressed and that would be the end of the matter.
I disagree with this argument. The U.S. and other countries cannot free-ride on Europe’s policy decisions, just as consumers cannot rely on companies to “self-regulate.”
For one thing, support for regulation varies widely from country to country — and of course, within countries. Public opinion in some EU member states shows support for stringent rules, but that support is not always shared elsewhere. For example, in response to a Pew survey question, 85% of Germans favored the more stringent European data privacy standards, while only 29% of American respondents felt the same. A Dell-EMC study reveals significant differences in willingness to trade privacy for services across countries and across different uses of digital applications; of the 15 countries studied, Germans were the most concerned about privacy and Indians the least.
In the U.S., the approach to digital privacy has been more piecemeal: in effect, it is predicated in part on the idea that companies’ ability to collect, analyze, sell and monetize user data with minimal restrictions is the basis for an innovative digital industry; new users are drawn in by free services, and companies make money from the data collected. The idea of protecting U.S. competitiveness and its position in technology development will likely be central to the lobbying that surrounds any effort to change laws or boost federal regulations. Moreover, many privacy laws are set by states and differ considerably from place to place. California lawmakers, for example, have proposed legislation to establish a data-protection authority, while other states may offer very little regulatory protection.
Overall, it is clear that societal demands and the willingness to “pay” by trading off privacy for other benefits vary significantly. Both consumers and companies will likely have to manage different rules for different markets and different technologies.
Emerging markets are often overlooked in these conversations, and they bring up a host of different issues. Some of Facebook’s biggest markets are in the developing world, and Facebook is experiencing its fastest growth in Asia and Africa. Of the top 10 countries with the most Facebook users, only two are in the developed world. Those two nations, the U.S. and the U.K., collectively account for 13% of all Facebook users. The remaining eight account for 41% of all Facebook users. What’s more, of the top 10 cities with the largest number of active Facebook users as of July 2017, all are in the developing world.
Our research on digital trust around the world, reported earlier in HBR, has found that users in the developing world are more trusting of online content, and—combined with fewer sources of objective information or little access to a free press—more vulnerable to manipulation by false information.
In Myanmar, for instance, Facebook is the dominant internet site because of its Free Basics program, which lets mobile-phone users connect to a few selected internet sites, including Facebook, without paying extra fees or using up allotted data in their mobile plans. In 2014, Facebook had 2 million users in Myanmar; after Free Basics arrived in 2016, that number climbed to 30 million. Recent rumor campaigns inciting violence against the Rohingya minority ethnic group in Myanmar were spread in part on Facebook, sparking systematic persecution and violence.
Facebook-owned WhatsApp has been identified as a primary carrier of fake news and divisive rumors in India, where its users’ messages have been described as a “mix of off-color jokes, doctored TV [clips], wild rumors and other people’s opinions, mostly vile.” Kenya has identified 21 hate-mongering WhatsApp groups. Data from WhatsApp can be harvested for a variety of socially harmful purposes.
While the developing world should also be given the information safeguards that are likely to appear in the West, I believe governments in the developing world should be wary of regulations as extensive as GDPR. Such regulations would impose costs on the mostly small businesses that operate in these regions, and imposing a heavy burden on fledgling local data industries could stifle those companies’ chance to grow and compete.
Facebook, for its part, is taking steps to ensure that the 1.5 billion users who live mostly in developing nations will not be able to file complaints under the EU GDPR and will instead be governed by U.S. privacy laws. Collectively, these factors raise the specter of a world balkanized into digital “safe zones” in the advanced nations and digital “red zones” in the developing nations. Far from being a force for equalization and inclusion, digital technology penetration and the degree of data protection could become a new form of inequality.
So if GDPR isn’t the answer for companies outside of Europe, what about self-regulation? Some are hopeful that CEOs will put privacy protections in place out of a sense of social responsibility, something Zuckerberg himself has discussed. During Zuckerberg’s recent testimony before the U.S. Congress, this theme of “responsibility” was repeated—by lawmakers, by Zuckerberg, and by commentators. The challenge with leaving digital privacy to corporate responsibility and self-regulation, of course, is that the digital industry has been enormously successful precisely because it has collected and monetized data with few constraints. Facebook’s own profits grew 61% at the end of 2017, despite a challenging year: it brought in a handsome $12.7 billion in revenue over the last three months of 2017, with a profit of $4.26 billion.
To get a sense of how privacy restrictions could bite into the bottom line, consider that, according to analyses by Goldman Sachs, Facebook could “potentially see a negative impact of up to 7% from GDPR.” With this much at stake, self-regulation is not a tenable way to ensure the full safeguards necessary for digital privacy. Since data is the currency at the core of the industry’s business models—and there are no viable alternative business models in sight—a company’s own sense of “digital social responsibility” will be tempered by the negative economic impact of limiting its use of consumer data; companies can be expected to do just enough to keep consumers and political entities from revolting, while building goodwill through other means. If citizens demand even greater digital privacy, it will have to come through forward-looking public policy, consumer activism and regulation.
It’s important to get this right, and not just free-ride on other countries’ regulations or hope that CEOs will make their own rules. The development of new technologies, such as AI, may hang in the balance. Consider that Facebook, much like fellow digital giants Google, Amazon, and Apple, is betting on artificial intelligence (AI) as the next source of innovation and competitive advantage. The more constraints there are on data collection and processing, the more slowly companies can capitalize on advances in AI. This, too, would create an opportunity cost—revenue and market share losses in the future, especially in competition with rising Chinese tech companies that have access to data from a vast local market with few data privacy concerns or rules and a more intrusive government. Policymakers will have to grapple with whether increased regulation and rules about data protection will hinder future competitiveness, presenting society with a crucial tradeoff over the services that AI and machine learning could create.
The bottom line is that neither European regulators nor Mark Zuckerberg alone will secure our digital futures around the world: ensuring privacy, transparency, and innovation takes work, and there are no shortcuts. Regulators, consumer advocates, and technology policymakers will have to do the hard work of developing an independent vision that offers checks and balances. Governments will need the political will to institute regulations that strike a balance between local realities and global competitiveness. The EU regulations and Facebook’s serial stumbles and apologies can, at best, be a good place to start this essential conversation. But it still has a long way to go.