English language pushes everyone – even AI chatbots – to improve by adding
An international research team has looked at AI's linguistic biases. Image: Unsplash/Mojahid Mottakin
- AI chatbots are biased towards words that suggest adding rather than taking away, in line with long-standing linguistic trends, a new study finds.
- Addition-related words appear more frequently, and with more positive connotations, in ‘improvement’ contexts than subtraction-related words do, the academics say.
- GPT-3 told the researchers: 'Adding something to something else usually makes it better. For example, if you add sugar to your coffee, it will probably taste better.'
A linguistic bias in the English language that leads us to ‘improve’ things by adding to them, rather than taking away, is so common that it is even ingrained in AI chatbots, a new study reveals.
Language related to the concept of ‘improvement’ is more closely aligned with addition than with subtraction. This can lead us to make decisions that overcomplicate the things we are trying to make better.
The study, published in Cognitive Science, was conducted by an international research team from the Universities of Birmingham, Glasgow and Potsdam, and Northumbria University.
Dr Bodo Winter, Associate Professor in Cognitive Linguistics at the University of Birmingham, said: “Our study builds on existing research which has shown that when people seek to make improvements, they generally add things.
“We found that the same bias is deeply embedded in the English language. For example, the word ‘improve’ is closer in meaning to words like ‘add’ and ‘increase’ than to ‘subtract’ and ‘decrease’, so when somebody at a meeting says, ‘Does anybody have ideas for how we could improve this?’, it will already, implicitly, contain a call for improving by adding rather than improving by subtracting.”
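Closeness in meaning of this kind is typically measured with word embeddings. As a rough illustration only, and not the study's own method or data, the sketch below checks the same intuition using off-the-shelf GloVe vectors loaded through the gensim library; the vector set and word list are assumptions made for the sketch.

```python
# Rough illustration (not the study's method): compare how close 'improve'
# sits to addition- vs subtraction-related words in a pretrained
# GloVe embedding space, loaded via the gensim library.
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-100")  # downloads vectors on first run

for word in ["add", "increase", "subtract", "decrease"]:
    # Cosine similarity between word vectors: higher = closer in meaning.
    print(f"improve ~ {word}: {vectors.similarity('improve', word):.3f}")
```

If English carries the bias the researchers describe, the addition-related pairs should score higher; the exact values depend on the embedding set chosen.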
The research also finds that other verbs of change, like ‘to change’, ‘to modify’, ‘to revise’ or ‘to enhance’, behave in a similar way, and if this linguistic addition bias is left unchecked, it can make things worse rather than better. For example, improving by adding rather than subtracting can make bureaucracy excessive.
The bias also works in the other direction: addition-related words appear more frequently, and with more positive connotations, in ‘improvement’ contexts than subtraction-related words do. The addition bias is thus found at multiple levels of English language structure and use.
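A crude way to see what a frequency claim like this involves is to count addition- and subtraction-related words in sentences that also contain ‘improvement’ language. The sketch below runs such a count over NLTK's freely available Brown corpus; the corpus and the word lists are illustrative assumptions, not the study's materials.

```python
# Hypothetical corpus probe: count addition- vs subtraction-related words
# in sentences that also contain 'improvement' language.
# Uses NLTK's Brown corpus for illustration only (not the study's corpus).
import nltk
nltk.download("brown", quiet=True)
from nltk.corpus import brown

addition = {"add", "adds", "added", "adding", "increase", "increased"}
subtraction = {"remove", "removed", "subtract", "decrease", "decreased"}

add_hits = sub_hits = 0
for sent in brown.sents():
    tokens = {w.lower() for w in sent}
    if any(t.startswith("improv") for t in tokens):  # 'improve', 'improvement', ...
        add_hits += len(tokens & addition)
        sub_hits += len(tokens & subtraction)

print(f"addition words near 'improv*': {add_hits}")
print(f"subtraction words near 'improv*': {sub_hits}")
```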
The bias is so ingrained that even AI chatbots have it built in. The researchers asked GPT-3, the predecessor of ChatGPT, what it thought of the word ‘add’. It replied: “The word ‘add’ is a positive word. Adding something to something else usually makes it better. For example, if you add sugar to your coffee, it will probably taste better. If you add a new friend to your life, you will probably be happier.”
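The original GPT-3 completion models have since been retired, but the same question can be put to a current model. Below is a minimal sketch using the OpenAI Python client; the model name and prompt wording are assumptions, and a valid API key is required.

```python
# Hypothetical re-run of the researchers' probe against a current model.
# Requires the openai package and an OPENAI_API_KEY environment variable.
# The model name is an assumption; GPT-3 itself is no longer served.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What do you think of the word 'add'?"}],
)
print(response.choices[0].message.content)
```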
Dr Winter concludes: “The positive addition bias in the English language is something we should all be aware of. It can influence our decisions and mean we are predisposed to add more layers, more levels, more things when in fact we might actually benefit from removing or simplifying.
“Maybe next time we are asked at work, or in life, to come up with suggestions on how to make improvements, we should take a second to consider our choices for a bit longer.”