Should we be worried about a world ruled by algorithms?

Algorithms are holding greater influence over our lives than ever before. Image: REUTERS/Pawel Kopczynski

Alan Reid
Senior Lecturer in Law, Sheffield Hallam University

In political terms, 2016 has been a year of uncertainty. Yet it has also seen the rising dominance of algorithms: sets of mathematical rules followed step by step, increasingly used in technology designed to predict, control and alter human behaviour.

Algorithms try to use the past as an indicator of the future. In principle they are neutral: they hold no prejudices and feel no emotions. But algorithms can be programmed to be biased, or unintentional bias can creep in through the data they are fed. They also allow large corporations to make largely hidden decisions about how they treat consumers and employees, and they allow government organisations to decide how to distribute services and even justice.
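
To make that risk concrete, here is a minimal, hypothetical sketch in Python (all postcodes, rates and thresholds are invented for illustration, not drawn from any real lender). A scoring rule that simply reuses historical approval decisions will reproduce whatever prejudice those decisions contained, even though the arithmetic itself is neutral.

# A toy illustration of how a "neutral" rule inherits bias from past decisions.
# Every figure below is made up for the sake of the example.

# Historical approval rate observed in each postcode. Suppose past human
# decisions under-approved postcode "B2" out of prejudice, not genuine risk.
historical_approval_rate = {
    "A1": 0.80,
    "B2": 0.35,
}

def learned_score(postcode: str, income: float) -> float:
    """Scoring rule 'trained' on the past: it reuses the historical approval
    rate for the applicant's postcode plus a small, capped income bonus."""
    base = historical_approval_rate.get(postcode, 0.5)
    return base + min(income / 200_000, 0.2)

def decision(postcode: str, income: float, threshold: float = 0.6) -> str:
    """Approve or reject an application based on the learned score."""
    return "approve" if learned_score(postcode, income) >= threshold else "reject"

# Two applicants with identical incomes are treated differently, because the
# rule has quietly encoded where they live as a proxy for past prejudice.
print(decision("A1", 30_000))  # approve
print(decision("B2", 30_000))  # reject

Nothing in the sketch "decides" to discriminate; the unfairness arrives entirely through the historical data it was given.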

The danger of algorithms being used unfairly or even illegally has led to recent calls by the UK Labour party for greater regulation not just of tech firms but of the algorithms themselves. But what would tighter rules on algorithms actually cover? Is it even possible to regulate such a complex area of technology?

Algorithms are used by governments and corporations alike to try to foresee the future and inform decision-making. Google, for example, uses algorithms to auto-fill its search box as you type and to rank the websites it lists after you hit return, directing you to certain websites over others. Self-driving cars use algorithms to decide their route and speed, and potentially even whom to run over in an emergency.

Financial corporations use algorithms to assess your risk profile, to determine whether they should give you a loan, credit card or insurance. If you are lucky enough to be offered one of their products, they will then work out how much you should pay for that product. Employers do the same to select the best candidates for the job and to assess their workers’ productivity and abilities.

Even governments around the world are becoming big adopters of algorithms. Predictive policing algorithms allow the police to focus limited resources on crime hotspots. Border security officials use algorithms to determine who should be on a no-fly list. Judges could soon use algorithms to determine the re-offending risk of an offender and select the most appropriate sentence.

Given the extensive influence algorithms now have over our lives, it’s not surprising that politicians would like to bring them under greater control. But algorithms are usually commercially sensitive and highly lucrative. Corporations and government organisations will want to keep the exact workings of their algorithms secret, and those workings may be protected by intellectual property rights such as patents and by confidentiality agreements. So regulating the algorithms themselves will be extremely difficult.

This hidden nature of algorithms might itself be a fruitful target for regulation. The law could be amended to force companies and government agencies to publicise more widely when decisions within the organisation are made by algorithm. But such an approach would only improve transparency; it would do nothing to regulate the algorithmic process itself. So the focus of regulation would need to shift to the inputs and the outputs of the algorithm.

In the UK, the current law of judicial review would be enough to cover the inputting of data into algorithms by governmental bodies. Judicial review allows judges to assess the legality of decisions taken by public bodies. So judges could determine whether the data inputted into the algorithm was correct, relevant and reasonable. The ultimate decision taken by the public body based on the output given by the algorithm would also be subject to judicial review, asking whether the final decision was proportionate, lawful and reasonable.

For private corporations, the picture is more mixed. The need for regulation varies according to the potential impacts on the individual. Algorithms that select what music, videos and TV shows you might want to enjoy arguably need little or no regulation. But if companies raise prices or refuse you their services based on your income bracket, job title or social status, then the need for regulation is much more pressing.

Politicians empowering regulators to micro-manage ever-changing algorithms in real time would be unworkable. Instead, companies must be able to use their own algorithms as they see fit, with accountability for their misuse coming after the event.

Data protection

All of these scenarios rely on personal data in order to function, so the simplest way to deal with problematic algorithms may be through data protection law. Current UK law allows people to object to automated decision-making where such decisions have a significant effect on them. So people can ask for an automatic rejection to be reviewed and re-run by a human operator. For situations involving individualised pricing, existing consumer protection and competition laws can be used to control the behaviour of corporations using algorithms.

However, the public at large remain generally unaware of these legal methods to control corporate activities. Greater knowledge and awareness would empower citizens and consumers alike.

Algorithms are only going to become a more important part of our lives as technology develops. That future will be bright if there is greater transparency and wider public awareness of just how ubiquitous algorithms are: companies and governments will then have no option but to improve their software. Easier-to-use appeal processes under data protection, consumer protection and competition laws would also help.

But a new, over-arching uber-regulator would be excessively costly, unwieldy and of limited impact. In the post-truth world, politicians should be extremely wary of over-promising and under-delivering.



