Amazon - Remove Gender Stereotypes from Alexa to #UngenderAI

Until a few years ago, people didn't need 24/7 internet. We were happy spending a few minutes checking the two emails we got from random people telling us we'd won lotteries. I myself survived on 50 MB of data per month. Now I use 1.5 GB per day (and on some days, even more). How and why? A certain company gave me "that" choice. I didn't know it could be an option, but they made it known to me that it was, and that's how I started to prefer it. That's how all the other companies started mimicking "the trend". And that's why I now get scolded by my parents for being online 24/7!

So, a company systematically established a certain behaviour in me, without my even realising it.

Similarly, when companies are asked why they chose a female voice for Alexa and answer that people "preferred" it, please ask them (and yourselves): who gave people those preferences? Companies. When testing their products, companies decide which choices customers get to pick from. That's where the bias begins. Then people pick from those choices, and how they choose stems from their own psychology. Another bias. So don't blame customers exclusively for choosing a female voice for Alexa. Bias is an inescapable human trait, but does it also have to pervade our technology now? Agreed, machines were built to simulate the human mind, but should they also simulate our stereotypes and biases?

Ask the right question: why must all virtual assistants have female voices, female names, and female mannerisms by default? Why was gendering technology so important? Since when have technology, innovation, and science been labelled with a particular gender? Does science even have a gender? Should science have a gender?

Alexa, named after the Library of Alexandria, could just as easily have been Alex. Apple's Siri translates to "a beautiful woman who leads you to victory" in Old Norse. Microsoft's Cortana is named after a fictional synthetic intelligence in the Halo video-game series that has no physical form but projects a holographic version of "herself".

Hence, our "21st century digital" universe is fraught with "17th century social" problems. Although virtual assistants lack bodies, they embody what we picture when we imagine a personal assistant: a competent, efficient, deferential, and meek woman who handles every task, big or small, and remains polite and obedient throughout. It is no accident that Siri, Cortana, and Alexa all have female voices, female names, and female mannerisms "by default". People are conditioned to expect women, not men, in "assistance" roles, and the makers of digital assistants are shaped by those same social expectations. AI technology thus adapted to pre-existing stereotypes and helped to perpetuate them, and the gendering of AI became inevitable. Sign my petition to help change this.

A few assistants do offer a male voice as an option. But as one of the most widely used platforms, Alexa can lead the way, so I am asking Amazon to stop reinforcing problematic gender stereotypes in AI by:

  1. Ending the regressive practice of making voice assistants female by default 
  2. Developing a genderless voice assistant (like Q)
  3. Imparting regular gender-sensitive training to AI developers within the company

UNESCO's 2019 report, titled "I'd Blush If I Could", notes that "their hardwired subservience influences how people speak to female voices and models how women respond to requests and express themselves. To change course, we need to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them." The title of the report is, in fact, a reference to the standard answer given by Siri's default female voice in response to insults from users.

When we confidently imagine AI as a subservient woman, we write dangerous and outdated stereotypes of women not just into our imagination but also into our science, innovation, and technology.

We must not chain ourselves to the 'shoulds' of gender-based preconceptions and misconceptions. Yet here we are, busy gendering technology and fighting to justify why technology needs a gender at all. We haven't fixed sexism, racism, or any other -ism in humanity yet, but we are already halfway through coding them into technology. Technology is meant to benefit all members of society, regardless of age, gender, religion, or social status, rather than replicate human biases, perpetuate disparities, or widen the gap between the haves and have-nots.

So, it's high time we shouldered our responsibility to create and inspire technology that is not only effective but also ungendered, technology that breaks existing gender stereotypes and discriminatory social norms.

Sign my petition asking Amazon to lead the way with Alexa by removing these gender stereotypes. Let’s #UngenderAI together.