Emerging Technologies

Do we judge robots on their colour? This study says we do

Humanoid robot Justin at the German Aerospace Center (DLR) in Wessling, near Munich, Germany, October 4, 2018. Unarmed black robots are seen as more aggressive than unarmed white robots. Image: REUTERS/Andreas Gebert

Sean Fleming
Senior Writer, Forum Agenda

Picture a robot. What colour are you imagining? The chances are it’s white or metallic, because robots of other colours are few and far between, according to a new study.

That got the researchers from New Zealand’s University of Canterbury thinking: is there such a thing as race bias where robots are concerned? And, if there is, what are the implications?

Associate Professor Christoph Bartneck and his team recruited 163 people to participate in the study.

To establish if people attribute something akin to racial qualities to robots and what effect that might have, they used what’s known as the shooter bias paradigm, along with a series of in-depth questions.

Threat perception

Shooter bias is a concept rooted in studies of whether police officers are more likely to shoot people who aren’t white, regardless of any actual threat to life.

Bartneck and his team showed the participants a series of images: human and robot, armed and unarmed, black and white. If they perceived a threat, participants pushed a button that substituted for pulling the trigger of a gun.

The researchers looked at whether participants correctly distinguished aggressors from non-aggressors, and at their reaction times: if a participant chose to shoot, how much time elapsed between seeing the image and hitting the button.

Researchers used different robots to determine when people perceived a threat. Image: University of Canterbury

The results, published in the report Robots and Racism, show that unarmed black robots were shot more often than unarmed white ones. They were also shot more quickly, suggesting an instinctive reaction rather than a considered assessment of the visual information.

“This bias is both a clear indication of racism towards black people, as well as the automaticity of its extension to robots racialized as black,” say the report’s authors.

Developing diversity

Of course, robots cannot be said to have race in the way the word is applied to people. But that’s a distinction seemingly at odds with people’s perceptions.

Participants in the study were “able to easily and confidently identify the race of robots according to their racialization,” the report states.

If people apply racial characteristics to robots, wittingly or not, the development of diversity in robots will become important, the researchers argue.

For anyone working in robot development, they say, these findings ought to inform the decisions they make around appearance: “If robots are supposed to function as teachers, friends, or carers, for instance, then it will be a serious problem if all of these roles are only ever occupied by robots that are racialized as white.”


License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.

© 2024 World Economic Forum