Tuesday, April 16, 2024

Alexa, why aren’t you a dude? How female digital assistants reinforce stereotypes


We find ourselves surrounded by helpful assistants these days, from Apple’s Siri to Microsoft’s futuristic Cortana to Amazon’s Alexa to … whatever the designation for “OK Google” is, and every one of them defaults to a female persona. In response, many writers are asking whether the tendency of modern digital assistants to skew female is a dangerous thing for society.

Sure, you can change Siri to a different gender and even a different accent (I was carrying around an Australian bloke for a while before switching to a Brit, preferring a digital Alfred Pennyworth), but digital assistants skew female by default and tend to stay that way. Some think that default is having a negative effect on society, training everyone to think of women as assistants rather than as whole human beings and equals.


There are other factors at play as well. Some researchers believe we as human beings have a natural inclination to prefer a female voice. Historians and sociologists point to precedent, hypothesizing that the preponderance of female voices in navigation devices dates back to World War II, when, the story goes, female voices were used in aircraft cockpits because they stood out from those of the all-male crews. Still others draw a direct line back to the portrayals of women in Mad Men, looking back on a time when women were often secretaries and telephone operators. But just because you can find a historical precursor doesn’t make it okay to perpetuate a stereotype.

There’s even a school of thought that attributes the dominance of female voices to the instances of male robotic menace in popular media: think HAL 9000 from 2001: A Space Odyssey or W.O.P.R. from WarGames. (Some might argue that GLaDOS from the video game Portal is way scarier than any movie robot.) This one is also easy to dismiss, thanks to the strange, fascinating portrayals of female A.I.s in the films Her and Ex Machina.

The linguistic engineers at Google and Apple face plenty of hurdles; women and men not only sound different when they speak, they also use different words. This can create a linguistic version of digital design’s “uncanny valley”: an artificial voice that sounds female but uses male-sounding phrases won’t seem authentic to the human speaking to it. The result is an exchange in which the user focuses more on the sound of the assistant’s voice than on the information being relayed.

Whatever the reason, digital assistants are certainly based on millions of dollars in market research, and the Silicon Valley giants who funded that research aren’t releasing the statistics anytime soon. The female default is clearly deliberate, as evidenced by the fact that A.I.s like Siri have built-in responses to deflect questions about gender. If you ask Siri what gender it is, the response is generally “I am genderless, like cacti, or certain species of fish,” or some variation.


At the same time, both Apple and Google have stated a desire to make their digital assistants more sophisticated, giving users a sense of a relationship rather than a device. That’s a potentially troubling trend if the makers of anthropomorphic assistants accentuate non-threatening, subservient qualities in order to win social acceptance. Scarier still is the idea that digital assistants are not only reflecting gender bias, but causing it. Kids are already anthropomorphizing their robot friends, and also bossing them around, a behavior parents don’t want them to extend to actual people.

Killer robot expert Daniel H. Wilson, a roboticist and the author of Robopocalypse and How to Survive a Robot Uprising, agrees with the flood of responses urging caution as artificial intelligence grows more and more sophisticated.


“The preponderance of female virtual assistants is proof that robots can be a reflection of human stereotypes,” he told Digital Trends. “As we continue to create technology that stands in for people, it’s crucial that designers work to avoid perpetuating human prejudice through their creations.”



Gender bias isn’t a new phenomenon, and it shows up in surprising ways: it’s the reason your kid couldn’t buy a Rey action figure when The Force Awakens came out, and why Tony Stark replaces his trusty A.I. Jarvis with a “Girl Friday.” But it is something A.I. developers should consider as they continue tweaking their digital assistants. Dissenting voices such as writers Jessica Nordell and Soraya Chemaly are asking the right questions.

“Many people dismiss issues like these, which are fundamentally about representation and its impact on self-image, ambition, and human potential, as inconsequential, but they are mistaken,” writes Chemaly at Role Reboot. “Naming and designing products in these ways is both a symptom of bias and a cause, and steps should be taken in the design of new technology to understand how social inequalities are related to technical ones.”

Over at the New Republic, Nordell also has some sage advice: “At the very least, the default settings for these assistants should not always be women,” she writes. “Change Viv to Victor, and maybe one fewer woman will be asked to be the next meeting’s designated note-taker.”
