Does big data challenge discrimination laws?

Mark Burdon
Lecturer, The University of Queensland
Staff recruitment and retention are ongoing challenges for employers. Proponents of big data in the workplace now claim it can change that.

We’re entering a new age of predictive selection that is producing very surprising results. However, these new forms of decision-making and the results they produce could challenge the very basis of our anti-discrimination and privacy laws.

So what’s new?

At some point, we’ve all had to apply for a job. We complete an application, with a few positive embellishments here and there. We then submit it. And, fingers crossed, we get an interview.

We then get checked out to see if we’re a good fit. If we are, we get the job. It’s a simple and time-honoured practice.

We have a good idea why we got the job. We went to the “right” university. We got a good GPA. We demonstrated we could be a safe pair of hands.

We also know that our employer would not make a decision to hire us based on our race, gender or sexual preference. That’s against the law.

So what if we didn’t get selected on the basis of our GPA, or the “right” university we went to? What if we didn’t even get selected on the basis of our job application? What if the deciding factor was not our application at all, but the browser we used to upload it?

Welcome to the potentially confounding new world of big data in the workplace. It’s a world that turns the traditional process of staff selection upside down, one in which algorithmic, data-driven decisions are the norm. This is a world that demands the collection of more and more types of information: the metadata from our working activities, our social media content and anything else that might be remotely relevant (or, for that matter, seemingly irrelevant).

None of this is entirely new. Michael Lewis’s book Moneyball demonstrated the value of Billy Beane’s data-driven quest and its dramatic effect on the baseball team he was managing.

However, Beane’s data-driven quest is now being replaced by a predictive quest, and it is producing some rather unusual results. Consider these examples.

A candidate who is creative but not overly inquisitive and is a member of one but no more than four social networks is more likely to be hired as a customer-care representative by Xerox. Especially if they live close to the office and have access to reliable transport.

Software programmers who have never written open-source code are being recruited for open-source programming positions if they have the right online profile and an interest in Japanese manga websites.

Even a certain combination of words in a tweet or a LinkedIn post can now become a reliable indicator for a good software programming candidate.
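
To see what this kind of predictive selection looks like in practice, here is a minimal, purely hypothetical sketch. The features, weights and synthetic data below are illustrative assumptions, not any vendor’s actual model; the point is only that a classifier fitted to historical retention data can end up scoring candidates on signals, such as the browser used to apply, that say nothing about job-relevant skills.

```python
# Hypothetical sketch: a classifier fitted to synthetic "retention" data.
# Feature names and weights are illustrative assumptions, not a real model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 1000

# Synthetic application features of the "unintuitive" kind described above.
browser = rng.integers(0, 2, n)       # 0 = pre-installed browser, 1 = separately installed
networks = rng.integers(0, 8, n)      # number of social networks the candidate belongs to
commute_km = rng.uniform(1, 50, n)    # distance from home to the office, in km

# Synthetic "stayed at least a year" outcome, loosely tied to those features
# purely so the example has a pattern to find.
logits = 0.8 * browser - 0.3 * np.abs(networks - 2) - 0.03 * commute_km
stayed = (rng.uniform(size=n) < 1 / (1 + np.exp(-logits))).astype(int)

# Fit the model and inspect which signals it ends up weighting.
X = np.column_stack([browser, networks, commute_km])
model = LogisticRegression().fit(X, stayed)
print(dict(zip(["browser", "networks", "commute_km"], model.coef_[0].round(2))))
```

On data like this, the fitted coefficients become the “unintuitive” signals a recruiter might then act on.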

So what’s the problem?

These new decision-making processes and the results they generate could potentially cause significant problems for our legal frameworks of anti-discrimination and information privacy law. Both of these laws were designed in the 1960s and 1970s and may not be well suited to deal with the challenges of big data in the workplace.

Let’s start with anti-discrimination law. Anti-discrimination laws prohibit discrimination on the basis of certain social and physical attributes: race, gender, sexual orientation, disability. It seems obvious to us now that staff should not be selected or rejected on these attributes. We instinctively know that it is wrong to make decisions on that basis.

The problem with big data in the workplace is that it is often impossible to connect discrimination to the inequalities that flow from data analytics. Decisions on employee selection are being made on a range of attributes that are simply unintuitive: the browser we used to upload our application; our like of Japanese manga cartoons; even the words we use in a tweet. Big data techniques therefore recast the very notion of workplace discrimination.

Establishing a link between a protected attribute and a discriminatory big data practice is likely to be evidentially insurmountable. Even if a prohibited attribute is a factor in the predictive process, proving that a discriminatory factor existed, and how it was weighed, will be almost impossible. These processes involve millions of decisions made through complicated algorithmic calculations that the worker is neither given access to nor even told are taking place.
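
A minimal sketch, on entirely synthetic data, of why that evidential link is so hard to draw: a protected attribute that is never an input to the model can still shape its predictions through a correlated proxy feature, and nothing in the model’s inputs or coefficients names the attribute at all. The variables below are invented for illustration.

```python
# Sketch on synthetic data: the protected attribute is never a model input,
# yet its influence survives through a correlated "neutral" proxy feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

protected = rng.integers(0, 2, n)                   # protected attribute, excluded from the model
postcode = 0.7 * protected + rng.normal(0, 0.5, n)  # "neutral" feature correlated with it
skill = rng.normal(0, 1, n)                         # a legitimate, job-relevant signal

# Historical hiring outcomes that were themselves biased against group 1.
hired = ((skill - 1.0 * protected + rng.normal(0, 0.5, n)) > 0).astype(int)

# Train only on the apparently neutral features.
X = np.column_stack([postcode, skill])
model = LogisticRegression().fit(X, hired)
scores = model.predict_proba(X)[:, 1]

# The model still scores the two groups differently, but no input or
# coefficient names the protected attribute, so the link is hard to evidence.
print("mean score, group 0:", round(float(scores[protected == 0].mean()), 3))
print("mean score, group 1:", round(float(scores[protected == 1].mean()), 3))
```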

Big data also provides significant challenges for information privacy law. The underlying logic of big data is to collect everything and keep it forever. The search for the unintuitive requires nothing less. All information, at all times, could potentially be relevant.

All information therefore has the capacity to be personal information, because any piece of it could be used to identify an individual. However, information privacy law was never designed on the assumption that all information would need to be classed as personal information and protected.

Big data in the workplace could significantly challenge our existing legal frameworks of anti-discrimination and information privacy law. The real danger is the slippery slope of acceptance creep, where we simply accept the veracity of a correlated prediction without question. We then run the risk of creating a new form of discrimination, info-structural discrimination, in which patterns of discrimination and bias are embedded in information infrastructures themselves.

At that point, the dangers of big data are likely to outweigh its benefits and could negatively impact on all of us. We could find ourselves fruitlessly searching for the unintuitive attribute that will give us that job we are after, but which we can never control.

Published in collaboration with The Conversation

Authors: Mark Burdon is a Lecturer at The University of Queensland. Paul Harpur is a Lecturer in the School of Law at The University of Queensland.

Image: A man types on a computer keyboard in Warsaw, February 28, 2013. REUTERS/Kacper Pempel.
