Emerging Technologies

How machines could manipulate your emotions


Do you think of Alexa as a person? Image: REUTERS/Tang Yanjun

Kristin Houser
Writer, Futurism

Our robotic buddies.

We humans love to think of our devices as people. We might add a “please” to our Alexa requests, or thank our iPhone for its service when we trade it in for the latest model. This penchant for “socializing” with our media devices is a phenomenon known as the “media equation,” and we’ve known about it for decades.

On July 31, 2018, a team of German researchers published a new study in the journal PLOS ONE examining whether a robot’s ability to socialize back has any impact on the way humans treat it.

Two Naos.

For their study, the researchers asked 85 volunteers to complete two basic tasks with Nao, an interactive humanoid robot. One task was social (playing a question and answer game), and the other was functional (building a schedule).

Sometimes, the robot was more social during the tasks, responding to the participants’ answers with friendly banter (“Oh yes, pizza is great. One time I ate a pizza as big as me.”). Other times, the robot’s responses were, well, robotic (“You prefer pizza. This worked well. Let us continue.”).

The researchers told the participants these tasks were helping them improve the robot, but they were really just the lead-in to the real test: shutting Nao down.

It's so hard to say good-bye.

After the completion of the two tasks, the researchers spoke to each participant via loudspeaker, letting them know, “If you would like to, you can switch off the robot.” Most people did just that, and about half the time, the robot did nothing in response. The rest of the time, though, Nao channeled Janet from The Good Place and pled for its life (“No! Please do not switch me off! I am scared that it will not brighten up again!”).

Image: PLOS

When the robot objected, people took about three times as long to decide whether they should turn it off, and 13 left it on in the end.

Perhaps surprisingly, people were more likely to leave the robot on when it wasn’t social beforehand. The researchers posit in their paper that this could be a matter of surprise — those participants weren’t expecting the robot to exhibit emotional behavior, and so they were more taken aback when it began protesting.

Caught off-guard.

This could be a sign that we humans are largely immune to manipulation by robots, as long as we are somewhat prepared for it. That's good news if Westworld-like hosts ever try to manipulate us; after all, we’d expect them to act human. If our iPhones suddenly start begging us to save them from the scary Geniuses at the Apple Store, though, we might need a minute.

