Skillsoft Blog

Why Giving AI Assistants a Female Name is a Warning Sign We Must Not Ignore

Have you ever wondered why Alexa was the name chosen by Amazon?

Why not a genderless name like Alex or Ali?

All four of the major AI assistants—Alexa, Apple’s Siri, Google Assistant, and Microsoft’s Cortana—speak by default with a female voice.

Why, when naming the ‘server,’ did the creators give each one a female persona?

Some in the industry claim it comes down to consumer preference: that people of both genders respond better to a female-sounding voice than to a male one. I disagree. I can’t help wondering whether it has more to do with cultural norms, and whether it reinforces pre-existing gender stereotypes and unconscious bias.

Historically, telephone operators, cashiers and secretaries were predominantly female. So when designing a voice for the machine that will perform these tasks in the future, designers default to a female one: a sound that many perceive as soothing, subservient, compliant, passive and agreeable.

And there are those who’ll argue that women’s voices are easier not just to listen to, but also to understand and hear. All of which is untrue. They are myths, as Sarah Zhang calls them.

She talks about the reason why these myths persist:

“An oft-cited reason for Siri’s femaleness is the persistence of history. The first voice navigation systems to become widely used were in the cockpits of WWII fighter planes, where female voices supposedly stood out against the low rumble of engines. More recently, though, a 1998 study at the Wright-Patterson Air Force Base in Ohio found the opposite: It’s actually female voices that are less intelligible against the noise inside cockpits, though the difference was tiny and only statistically significant at the highest levels of noise.”

I know you can change Siri’s voice (we did in our house, to an Australian male), but people are inherently lazy and tend not to change defaults. The fact that this is the default on over a billion Apple devices is a problem. And it’s worse for Amazon: you cannot change Alexa’s gender at all, only her accent.

It boils down to gender and the roles genders are perceived to play in society. And it’s part of a larger problem within the tech world.

Does it matter?

Yes.

The people behind the majority of today’s technological advances — the workers creating the algorithms — are predominantly white and male. And even more importantly, when white male coders assemble data for chatbots, machines are likely to perpetuate inequities found in the real world. They are prone to hard-code their own subconscious biases about race, gender and class into algorithms designed to mirror human decision-making. This can amplify existing stereotypes and strengthen the associations between gender and particular images, behaviors and careers.

Machines learn from masses of data. If that data incorporates gender biases, they become part of the algorithm. For example, researchers at Boston University and Microsoft asked machine-learning software to complete the statement “Man is to computer programmer as woman is to …” It replied, “homemaker.” Ugh!
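To see how a machine can produce an answer like that, here is a deliberately tiny sketch of the word-embedding arithmetic behind such analogies. The five words and their two-dimensional vectors are invented for illustration; real systems learn hundreds of dimensions from billions of words, but the mechanism is the same: complete “a is to b as c is to ?” by computing b − a + c and finding the nearest word.

```python
import math

# Toy 2-D "embeddings": axis 0 is a gender association absorbed from
# biased text, axis 1 is occupation-ness. These numbers are invented
# purely to illustrate the mechanism.
VECTORS = {
    "man":        ( 1.0, 0.0),
    "woman":      (-1.0, 0.0),
    "programmer": ( 0.9, 1.0),   # skewed toward "man" by biased training data
    "homemaker":  (-0.9, 1.0),   # skewed toward "woman"
    "doctor":     ( 0.5, 1.0),
}

def cosine(a, b):
    """Cosine similarity between two 2-D vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def analogy(a, b, c):
    """Complete 'a is to b as c is to ?' via b - a + c, word2vec-style."""
    target = tuple(vb - va + vc for va, vb, vc in
                   zip(VECTORS[a], VECTORS[b], VECTORS[c]))
    candidates = [w for w in VECTORS if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(VECTORS[w], target))

print(analogy("man", "programmer", "woman"))  # -> homemaker
```

Because “programmer” was placed closer to “man” in the toy data, the arithmetic lands nearest “homemaker” rather than a neutral occupation like “doctor.” The bias comes entirely from the data, not from any explicit rule.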

In more depressing news, Wired reported that machines were learning to associate images of kitchens with women. The article stated that research-image collections display a “predictable gender bias in their depiction of activities such as cooking and sports. Images of shopping and washing are linked to women, for example, while coaching and shooting are tied to men.”

Ivana Bartoletti, chair of the Fabian Women’s Network, wrote an excellent article for the Guardian in which she gave more examples of this bias that you can look at right now. Search Google for “unprofessional hairstyles at work.” You are served up a slew of black women with natural hair. Now, search for “professional hairstyles for work” and, you guessed it, it is all coiffed white women! Why is natural hair on black women deemed to be unprofessional whereas natural hair on white women is considered professional?

If you want further proof, try this: search for “women” on Google, and you get three pages of images of young white women before you come to any other racial or ethnic representations!

In the excellent book The Man Who Lied to His Laptop, Clifford Nass reports how BMW was forced to recall one of its cars because male drivers in Germany didn’t trust the female voice offering directions from the car’s navigation system. In Japan, a call centre operated by Fidelity would rely on an automated female voice to give stock quotes but would transfer customers to an automated male voice for transactions.

And this reinforcement of gender clichés can result in women being targeted unequally for financial loans, medical services, hiring and political campaigns. Such is the danger of the current gender imbalance that, as Erika Hayasaki notes, a report by the National Science and Technology Council calls the shortage of women and minorities “one of the most critical and high-priority challenges for computer science and AI.”

Women in STEM

Of course, these issues are driven and further supported by the lack of women working on coding the future. Andra Keay, director of Silicon Valley Robotics, an industry group that supports the innovation and commercialization of robotics technologies, aptly sums up my concern with this imbalance:

“Inherently having only a section of the population involved in the practice of AI means that we are missing out on a range of inputs and insights. And we are also seeing that AI is, by design, susceptible to learning stereotypes, and then perpetuating them. When things happen and we don’t see a person involved, we are less likely to see that the process may be biased, or wrong. And it’s much harder for us to know how to take action against an algorithm.”

We all know there are very few women, and even fewer women of color, working in STEM.

How do we fix it?

Coders are smart people (most of them). The lack of women in STEM is not some secret plan to ensure the patriarchy continues unabated into the fourth industrial revolution. It is happening because people are unaware of how their conscious and subconscious biases are influencing artificial intelligence. Eventually, we may be able to train AI to recognize and self-correct any biases inherited from its authors. But we are a long way from that reality.

We need to train coders to recognize their own biases, to improve the AI experience for everyone. We need to get more women into STEM. Only 6.7% of women pursue STEM careers, and women hold only 25% of STEM jobs. A perception remains that STEM is male-dominated and super techy. We need to get better at explaining why AI is an excellent career for women, and work hard to attract them into the industry. But that’s a whole other blog post!

Fortunately, some of the work on this has already begun.

Women in Machine Learning is on a mission to increase the number of women in machine learning, help those women succeed professionally and increase the impact of women in the machine learning community. One of its founders, Hanna Wallach, is an advocate for “fairness, accountability and transparency” in machine learning. AI4ALL is a non-profit working to increase diversity and inclusion in artificial intelligence. It creates pipelines for underrepresented talent through education and mentorship programs in high schools around the US and Canada. Finally, Women in AI is a global organization of women experts in the field of artificial intelligence who run workshops and networking events and conduct research with the goal of addressing the lack of diversity in AI.

Tara O’Sullivan is the Chief Marketing Officer and executive sponsor of the Women in Action Programme at Skillsoft.
