Equity, Diversity and Inclusion

Can robots slay sexism?

Pascale Fung
Director of the Centre for Artificial Intelligence Research (CAiRE) and Professor of Electrical & Computer Engineering, The Hong Kong University of Science and Technology

Robots can play a positive role for society – if we free them from reflecting our own biases right back at us.

I recently led a discussion panel on socially disruptive technologies at the World Economic Forum’s Annual Meeting of the New Champions in Dalian, China. Experts from academia and industry exchanged ideas with audience members on how robots can do good, from improving health to protecting the environment.

As robots get better at understanding human emotional and social needs, there are more than a few ways in which artificial intelligence technology can help to boost women’s participation in the workforce.

Why men and women see different job ads online

Among the many criticisms of intelligent machines, one stood out in particular: the way Google’s recommendation algorithm serves up job adverts. Researchers from Carnegie Mellon University found that the algorithm was more likely to recommend high-prestige, high-paying jobs to men than to women.

The researchers used fake user identities that had no search history and differed only in gender to search for jobs online. They found that Google displayed far fewer ads for high-paying executive jobs to women than to men – 318 versus 1,852. At the same time, a simple Google image search for “CEO” turns up images of men, not women.

For those of us who work on recommendation engines, this was a shock and a wake-up call. Recommendation engines, such as Amazon’s “users who bought this book also bought these other books” or Google’s search listings, are based on aggregated group behaviour. In this case, the machine learning has absorbed the judgements and biases commonly made by human users. Much as a bank scrutinizes a customer’s credit history and decides whether to lend based on its prediction of repayment, recommendation engines look at user characteristics, including gender and ethnicity, and push the ads the algorithms judge most suitable – those with the highest likelihood of being clicked.
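To make the mechanism concrete, here is a minimal sketch using entirely synthetic data and a toy logistic-regression click model, rather than any real ad system: when the historical click logs are skewed by gender, the learned model reproduces that skew without ever being told to.

```python
# Minimal illustration with synthetic data (not any real ad system):
# a click-predictor trained on biased historical logs reproduces the bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
gender = rng.integers(0, 2, n)        # 0 = female, 1 = male
is_exec_ad = rng.integers(0, 2, n)    # 1 = high-paying executive-job ad

# Assumed historical logs in which men clicked executive ads far more often
# (the bias we want to expose), everything else being equal.
click_prob = 0.05 + 0.25 * (gender * is_exec_ad)
clicked = rng.random(n) < click_prob

X = np.column_stack([gender, is_exec_ad, gender * is_exec_ad])
model = LogisticRegression().fit(X, clicked)

# Predicted click-through rate for an executive-job ad, by gender:
for g, label in [(0, "female"), (1, "male")]:
    p = model.predict_proba([[g, 1, g]])[0, 1]
    print(f"{label}: predicted CTR for executive ad = {p:.2f}")
# The engine now "prefers" showing executive ads to men, purely because it
# learned from skewed logs; no explicit rule about gender was written.
```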

There is, however, a simple way to defeat such “stereotyping” by machines. If you search for “women CEOs” and then search for executive positions, the algorithm will return more equitable results because your search history also shows diversity.

Are algorithms inherently sexist?

That still seems to put the burden on the user to fight sexism perpetuated by machines. So are artificial intelligence algorithms inherently sexist because they learn from both the good and the bad of human behaviour? Will machines be our friends or our foes in the fight for gender parity?

One way to counter discrimination by recommendation engines, and to boost gender parity in the workforce, is to train the engines on a broader diversity of users and to enable them to do a better job of reading people’s intentions.

Instead of looking only at age, gender and a few other demographic points, the engines can be trained, using more sophisticated machine learning methods, to draw on more detailed user profiles and online behaviour, producing more personalized recommendations.

What’s more, artificial intelligence – in particular language processing – can help reveal unconscious gender bias in the way we use language in the workplace. It can be used to surface employee sentiment and a company’s corporate culture. It can even be designed to improve diversity in search and hiring.

A “spellcheck” for sexist language?

Many organizations invest in boosting diversity, and reports point to a gender dividend: organizations with women on the board or in senior executive positions report 20% better staff retention, employees who work 12% harder and 47% higher profitability.

However, how can we make sure that this investment has sound returns? Sentiment analysis, a technology that discerns how people feel from the words they use, can help highlight what employees really think of their workplace. If more women than men view leadership and promotion negatively, then the company will need to pay more attention to its diversity policies.
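As a toy sketch of the idea, assuming a small hand-made sentiment lexicon and invented example comments rather than a production sentiment model, one could compare average sentiment about promotion across gender groups like this:

```python
# Toy sketch: a tiny hand-made lexicon (not a production sentiment model)
# scores free-text employee comments about promotion, then averages the
# scores per self-reported gender group.
POSITIVE = {"fair", "supported", "transparent", "encouraged", "growth"}
NEGATIVE = {"overlooked", "opaque", "stuck", "ignored", "unfair"}

def sentiment_score(text: str) -> int:
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Invented example comments, labelled by gender.
comments = [
    ("f", "I feel overlooked for promotion and the process is opaque"),
    ("f", "Leadership roles seem out of reach and I am stuck"),
    ("m", "The promotion process felt fair and I was encouraged to apply"),
    ("m", "Plenty of growth here and managers are transparent"),
]

by_gender = {}
for gender, text in comments:
    by_gender.setdefault(gender, []).append(sentiment_score(text))

for gender, scores in by_gender.items():
    print(gender, sum(scores) / len(scores))
# A consistently lower average for one group is a signal that the company's
# diversity and promotion policies deserve a closer look.
```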

As reported in the New York Times, language statistics collected by Professor Benjamin Schmidt revealed unconscious gender bias in 14 million online student reviews of their professors.

The statistics show that people give men the benefit of the doubt more readily than women, and praise men for the same qualities they criticize women for. A male professor is respected as “brilliant”, for example, while a female professor displaying the same quality is deemed “bossy”.

The same attitude is prevalent towards female leaders at work. During a performance review or promotion process, it would be helpful to have software that scans reviewer letters for gender-biased language and highlights it, much as a spell check or grammar check highlights potential lexical or syntactic errors. This is feasible using sentiment analysis technology trained on the kind of statistics Professor Schmidt collected.
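A hedged sketch of such a “spellcheck”: the flagged terms and suggested rewordings below are illustrative only, loosely drawn from the kind of gendered descriptors that research like Professor Schmidt’s surfaces, not a vetted lexicon or a real product.

```python
# Sketch of a "spellcheck" for gender-biased language in reviewer letters.
# The term list and suggestions are illustrative, not an exhaustive or
# validated lexicon.
import re

FLAGGED_TERMS = {
    "bossy": "would 'decisive' or 'assertive' describe the same behaviour?",
    "abrasive": "would this word be used for a male colleague? consider 'direct'",
    "emotional": "consider describing the specific behaviour instead",
    "shrill": "comment on the content of the argument, not the voice",
}

def highlight_bias(text: str) -> list[tuple[str, str]]:
    """Return (term, suggestion) pairs found in a review letter."""
    hits = []
    for term, suggestion in FLAGGED_TERMS.items():
        if re.search(rf"\b{term}\b", text, flags=re.IGNORECASE):
            hits.append((term, suggestion))
    return hits

letter = "She is clearly capable but can come across as bossy and emotional."
for term, suggestion in highlight_bias(letter):
    print(f"flagged '{term}': {suggestion}")
```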

When a robot conducts a job interview

A key step in hiring and promotion is the search process – sifting through candidate profiles and resumés to find the best match. Again, predictive analytics technology can help make this process fair.

A recommendation engine can be programmed to cast a wide net for potential talent globally, without prejudice against any particular group. Such an engine can look at individuals’ online presence, from their profiles and publications to their social networks and online presentations, and use natural language processing technology to analyse how well they match a given position.
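As a crude, hedged illustration of that text-matching step, and not any specific recruiting product, a TF-IDF similarity between a job description and text gathered from candidates’ public material already yields a match score that never looks at gender or ethnicity:

```python
# Crude illustration of NLP-based profile-to-job matching (not a real
# recruiting product): TF-IDF cosine similarity between a job description
# and text gathered from each candidate's public profile and publications.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

job = "lead a machine learning research team, natural language processing, publications"
candidates = {
    "candidate_1": "published papers on natural language processing, led a research team",
    "candidate_2": "sales management, customer relationships, regional revenue targets",
}

texts = [job] + list(candidates.values())
tfidf = TfidfVectorizer().fit_transform(texts)
scores = cosine_similarity(tfidf[0], tfidf[1:]).ravel()

for name, score in zip(candidates, scores):
    print(f"{name}: match score = {score:.2f}")
# Note that nothing about gender or ethnicity enters this score, only the text.
```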

Potential employers can incorporate a diversity weighting into the system. This entire first stage can be automated.
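One hypothetical way such a diversity weighting could be wired in, with invented candidate names, scores and a made-up weight value, is to add a small configurable bonus for under-represented groups when assembling the shortlist:

```python
# Illustrative sketch with invented names and scores (not a real hiring
# system): combine an NLP-derived match score with a small, configurable
# diversity bonus when assembling a shortlist.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    match_score: float   # assumed output of an NLP profile-to-job matcher
    group: str           # demographic group used for the diversity weighting

def shortlist(candidates, diversity_weight=0.02, under_represented=("f",), k=3):
    def weighted(c):
        bonus = diversity_weight if c.group in under_represented else 0.0
        return c.match_score + bonus
    return sorted(candidates, key=weighted, reverse=True)[:k]

pool = [
    Candidate("A", 0.91, "m"),
    Candidate("B", 0.90, "f"),
    Candidate("C", 0.88, "m"),
    Candidate("D", 0.87, "f"),
    Candidate("E", 0.80, "m"),
]
print([c.name for c in shortlist(pool)])   # ['B', 'A', 'D'] with this small bonus
```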

Second, preliminary interviews are often conducted to learn about a candidate’s personality and leadership potential. There are many online tests and surveys for this purpose, and artificial intelligence can enhance them. An AI-based virtual interviewer can ask candidates questions and gauge their responses through image recognition of facial expressions, gestures and body language, analysis of tone of voice and, most importantly, natural language understanding of their answers to the interview questions.

One advantage intelligent machines have over human interviewers is scalability. Like a search engine that handles millions of search requests a day, a virtual interviewer can be accessible online and used by many candidates at the same time. Another advantage is that machines can be programmed to use gender-neutral language and avoid prejudicing the interview process. Their algorithms can be trained on a pre-selected sample that is not biased towards any particular gender.
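A minimal sketch of one way to build such a pre-selected sample, assuming past interview records are labelled with gender: downsample the over-represented group so that prevalence alone cannot be learned. Real de-biasing involves much more than this, but it illustrates the principle.

```python
# Sketch: balance past interview records by gender before training the
# scoring model, so that one group's prevalence among past hires cannot be
# learned as a signal. (Illustrative only; real de-biasing needs far more.)
import random

def balance_by_gender(examples, seed=0):
    """examples: list of (features, label, gender) tuples."""
    random.seed(seed)
    groups = {}
    for example in examples:
        groups.setdefault(example[2], []).append(example)
    smallest = min(len(group) for group in groups.values())
    balanced = []
    for group in groups.values():
        balanced.extend(random.sample(group, smallest))
    random.shuffle(balanced)
    return balanced

# e.g. 800 records from men and 200 from women become 200 of each.
data = [({"answer_length": i}, i % 2, "m") for i in range(800)] + \
       [({"answer_length": i}, i % 2, "f") for i in range(200)]
print(len(balance_by_gender(data)))   # 400
```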

I would argue that these kinds of intelligent systems should be used to assist human search committees rather than replace them. It would be valuable to compare the results of the machine “search” with those of the human search process, and to feed the differences back in an active learning loop that improves both human and machine accuracy in identifying the right candidate.

When girls are educated, everyone benefits

It is well known that when girls are well educated, the entire society benefits. With the advent of online programs such as the Khan Academy, more and more girls in remote areas can have access to education. Women can also receive online job training that allows them to keep learning while on maternity leave or during years off to care for their young children, if they so choose. Online training benefits both genders, but in a society where we often ask whether women can “have it all”, it will ultimately help to bridge the gender gap.

Artificial intelligence can again improve online training in many ways. Natural language processing and automatic speech recognition can be used for learning many subject areas. Interactive programs or home robots enabled by these technologies can coach women from home.

Text-writing software already exists that offers suggestions for better word choices when drafting application letters. Going a step further, semantic parsing and sentiment analysis can be used to provide advice on essay writing for job applications.

Robotic cooks and baby-sitters

Women still do the bulk of the cleaning, cooking and childcare at home. A study published last summer by the Economic and Social Research Council (ESRC) found that women in Britain still do at least two-thirds of the housework, even when they are the main breadwinner. Among women surveyed by India’s National Sample Survey Organization in 2011-12, around 39% in rural areas and about 50% in urban areas spent most of their time on domestic duties. While social attitudes need to shift to tackle this, technology can also help to free up women’s time.

Meals can be cooked in advance by a smart kitchen, which checks what’s in the refrigerator and orders more ingredients if needed. Robot vacuum cleaners can clean the house when nobody is home. Online tutoring systems can interact with children to supervise their learning. Robot babysitters could play a role in keeping the little ones safe. Self-driving cars, once fully developed and equipped with adequate safety measures, could ferry children to after-school activities. To balance robotic childcare with a human touch, parents and guardians can communicate with their children remotely through these robots by video.

It is also conceivable that a robot caregiver could remind the elderly to take their medicine on time and keep them company through conversation. More broadly, by making people healthier and giving them access to better treatments, artificial intelligence in healthcare helps to lighten the caregiving burden that falls on many women looking after the sick and the elderly.

Zara the Supergirl

At the beginning of this year, students and postdocs in my lab began pulling all of our various speech- and emotion-recognition modules together into a prototype empathetic machine we call Zara the Supergirl. She is a virtual robot for the moment, represented online by a cartoon character.

Can machines understand a person’s personality and leadership potential, and help match the best candidate, regardless of gender, to a senior executive or board position? Can machine caregivers empathize with a patient’s pain and sense distress? Can machines have the kind of emotional intelligence required for these tasks? The objective of building Zara is to show the possibility of an automatic coach.

As can be seen in this video, Zara’s algorithms study images captured by the computer’s webcam to determine your gender and ethnicity. She will then guess which language you speak and ask you a few questions in your native tongue: What is your earliest memory? Tell me about your mother. How was your last vacation? Tell me a story with a woman, a dog and a tree. Based on your facial expressions, the acoustic features of your voice and the content of your responses, Zara will respond in ways that mimic empathy.

After five minutes of conversation, Zara will try to guess your personality among 16 personality types: you seem like a visionary, you need to learn to delegate more, etc.
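The 16-type output can be pictured very schematically, without claiming this is Zara’s actual pipeline, as composing four trait scores, each assumed to come from upstream vision, audio and text models, into a four-letter label:

```python
# Schematic only, not Zara's actual pipeline: compose a 16-way "personality
# type" from four hypothetical trait scores, each assumed to be estimated
# upstream from the vision, audio and text features of the conversation.
def personality_type(extraversion, intuition, thinking, judging):
    """Each argument is a score in [0, 1] from an (assumed) upstream model."""
    return "".join([
        "E" if extraversion >= 0.5 else "I",
        "N" if intuition >= 0.5 else "S",
        "T" if thinking >= 0.5 else "F",
        "J" if judging >= 0.5 else "P",
    ])

# e.g. scores fused from facial-expression, prosody and answer-content models
print(personality_type(0.7, 0.8, 0.3, 0.6))   # prints "ENFJ"
```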

Zara is a prototype, but because she is based on machine-learning algorithms, she will get “smarter” as she interacts with more people and gathers more data. It is entirely conceivable to train Zara to interview candidates for leadership positions and provide an unbiased screening. It is also potentially possible to train Zara to talk to patients and the elderly in an empathetic way.

From better recommendation algorithms to cyber supergirls, artificial intelligence technology has the potential to help us identify and bypass our own prejudices.

Author: Pascale Fung, Professor of Electronic and Computer Engineering, Hong Kong University of Science and Technology

Image: A visitor speaks to Baidu’s robot Xiaodu at the 2015 Baidu World Conference in Beijing, China, September 8, 2015. Xiaodu, an artificial intelligence robot developed by Baidu, has access to the company’s search engine database and can respond to voice commands, Baidu says. REUTERS/Kim Kyung-Hoon
