AI could make more work for us, instead of simplifying our lives
Concerns over a decline in productivity are a key motivation behind efforts to automate everyday work. Image: REUTERS/Temilade Adelaja
Barbara Ribeiro
Associate professor in innovation management and policy and Honorary Lecturer, SKEMA Business School and University of Manchester

- Introducing automated processes can actually make work more complex and generate new tasks, according to a new study.
- The study was carried out in the field of synthetic biology, where the robotic platforms used generated extra hypotheses to test and the need for more experiments.
- The same principle could apply to tools such as ChatGPT, with workers reportedly being hired to stop it producing offensive content.
There’s a common perception that artificial intelligence (AI) will help streamline our work. There are even fears that it could wipe out the need for some jobs altogether.
But in a study of science laboratories that I carried out with three colleagues at the University of Manchester, we found that the introduction of automated processes intended to simplify work, and to free up people's time, can also make that work more complex, generating new tasks that many workers might perceive as mundane.
In the study, published in Research Policy, we looked at the work of scientists in a field called synthetic biology, or synbio for short. Synbio is concerned with redesigning organisms to have new abilities. It is involved in growing meat in the lab, in new ways of producing fertilisers and in the discovery of new drugs.
Synbio experiments rely on advanced robotic platforms to move large numbers of samples repetitively. They also use machine learning to analyse the results of large-scale experiments.
These, in turn, generate large amounts of digital data. This process is known as “digitalisation”, where digital technologies are used to transform traditional methods and ways of working.
Some of the key objectives of automating and digitalising scientific processes are to scale up the science that can be done while saving researchers time to focus on what they would consider more “valuable” work.
Paradoxical result
However, in our study, scientists were not released from repetitive, manual or boring tasks as one might expect. Instead, the use of robotic platforms amplified and diversified the kinds of tasks researchers had to perform. There are several reasons for this.
Among them is the fact that the number of hypotheses (the scientific term for a testable explanation for some observed phenomenon) and experiments that needed to be performed increased. With automated methods, the possibilities are amplified.
Scientists said that automation allowed them to evaluate a greater number of hypotheses, and that it increased the number of ways they could make subtle changes to the experimental set-up. Both had the effect of boosting the volume of data that needed checking, standardising and sharing.
Also, robots needed to be “trained” in performing experiments previously carried out manually. Humans, too, needed to develop new skills for preparing, repairing, and supervising robots. This was done to ensure there were no errors in the scientific process.
Scientific work is often judged on outputs such as peer-reviewed publications and grants. However, the time taken to clean, troubleshoot and supervise automated systems competes with the tasks traditionally rewarded in science. These less valued tasks may also be largely invisible, particularly to managers, who spend less time in the lab and are therefore less likely to notice the mundane work being done.
The synbio scientists carrying out these responsibilities were not better paid or more autonomous than their managers. They also assessed their own workload as higher than that of those above them in the job hierarchy.
Wider lessons
It’s possible these lessons might apply to other areas of work too. ChatGPT is an AI-powered chatbot that “learns” from information available on the web. When prompted by questions from online users, the chatbot offers answers that appear well-crafted and convincing.
According to Time magazine, in order for ChatGPT to avoid returning answers that were racist, sexist or offensive in other ways, workers in Kenya were hired to filter toxic content delivered by the bot.
There are many often invisible work practices needed for the development and maintenance of digital infrastructure. This phenomenon could be described as a “digitalisation paradox”. It challenges the assumption that everyone involved or affected by digitalisation becomes more productive or has more free time when parts of their workflow are automated.
Concerns over a decline in productivity are a key motivation behind organisational and political efforts to automate and digitalise everyday work. But we should not take promises of gains in productivity at face value.
Instead, we should challenge the ways we measure productivity by considering the invisible types of tasks humans can accomplish, beyond the more visible work that is usually rewarded.
We also need to consider how to design and manage these processes so that technology makes a more positive contribution to human capabilities.
License and Republishing
World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.
The views expressed in this article are those of the author alone and not the World Economic Forum.