Opinion

Congratulations, it’s a girl: Exploring sexism in artificial intelligence

The Works' Tomas Haffenden considers what the rise of female voice-activated assistants means for the future, and what the 'sea of dudes' has to do with it all.

I have just had a baby girl. I mean it is probably worth noting my wife played some part in her gestation and delivery, but as a modern progressive couple I’ll assume a minimum of 50 percent of the credit.

Her arrival has made me consider what the world holds in store for this little female version of me. As I bark at Siri, holding my daughter in the dark, for a “how to” video on baby swaddling, I suddenly feel unsettled.

As it becomes second nature to bark orders at the ‘person in our pocket’, does it matter that this person seems to be a she? Over 710 million people regularly use an AI assistant and this is set to more than double by 2021. Should I be celebrating the female position as oracle, or furious that my daughter has been assigned the role of assistant in the world’s future?

It could be argued that assigning gender to an AI assistant is a moot point. When asked, Siri, Alexa, Cortana and Google Home all respond with claims of gender neutrality.

This claim might be easier to accept if they didn’t all have a female voice as the default and, with one exception, names whose origins are hardly genderless – Cortana is named after a semi-nude female character from the video game Halo, and Alexa was selected in favour of Alex as a homage to the Great Library of Alexandria. Sexy librarian, anyone?

Halo’s Cortana

It is perhaps little wonder that no one seems to have considered if the use of a female voice could have any downsides. With women in computing roles steadily declining since a peak in 1991, women now represent less than 25 percent of the workforce. Evidently, the Adams far outweigh the Eves in the creation and development of this technology.

Microsoft researcher Margaret Mitchell refers to this imbalance as the ‘sea of dudes’. Is it possible that they are guilty of trying to realise their Weird Science, Her and Ex Machina fantasies or is there another explanation for the female AI assistants?

Numerous studies, including one by Karl MacDorman, associate professor at Indiana University, assert that both men and women report an overwhelming preference for a female voice. With such strong evidence suggesting a mutual preference, what risks are there in reflecting this in our AI assistants? Well, as this technology integrates itself into our pockets and our homes, will children everywhere have a new female role model? A servant who does exactly as she is told.

“Please” and “thank you” are unnecessary, when “Alexa, Stop!” will do just fine, thank you very much. As my little one grows up interacting with such technology, is there a chance that a female voice is the one she associates with subservience?

Why not use male and female voices? Clifford Nass, in his seminal work ‘Wired for Speech’, suggests voice preference is in fact more subjective than MacDorman would have us believe. He found participants preferred a male voice when learning and a female voice when receiving life advice. It is hard to ignore the inherent stereotyping in these findings, but let’s focus on the idea that the task requested should play a key part in which voice is most effective.

In fact, an AI assistant’s awareness of which voice is most appropriate might be closer than you think. Amazon has recently announced research into voice recognition that can detect mood and even the kind of words a person uses. Will we soon be in a place where Alexa selects her voice to suit our mood and the message being delivered? Maybe we can rely on big data, as opposed to the sea of dudes, to best decide the gender of our AI assistants. I mean, what could possibly go wrong?

It turns out quite a lot. Microsoft’s Twitter bot Tay was designed to learn from fellow Twitter users and create its own tweets based on its interactions. Users, instead of spreading love and acceptance, subverted their interactions, teaching Tay to produce content that was so racist and abusive it was taken offline after only 16 hours.

So where do we leave this? It seems that we are trapped. Our AI assistants are gathering data, tainted by our historic flaws. As our technology begins to learn from us, we must also consider how the next generation will learn from it. As marketers, developers and parents we need to decide what role we are going to play, not just in how this technology sounds, but in what parts of our imperfect world we are willing to let it reflect back at us. In the meantime, at least at our house, the bedtime story will be JavaScript for Babies and we’ll be saying “thank you” as we ask Alexa to dim the lights.

Tomas Haffenden is a senior digital producer at The Works.
