Apple under fire for Siri’s response to sexual harassment
“‘I’d blush if I could’ is not the response you’d expect to hear when you tell Siri she’s a slut — but it is,” Leah Fessler writes for Quartz at Work. “In February, months before the #MeToo movement erupted, I ran an experiment in which I sexually harassed Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana, and Google’s Google Home to document how these digital personal servants — whose names and voices are already feminized — peddle stereotypes of female subservience, putting their ‘progressive’ parent companies in a moral predicament.”
“Amazon tells Quartz At Work that in spring of this year, it created a ‘disengagement mode’ for Alexa, in response to ‘customer and engagement feedback,’” Fessler writes. “In Alexa’s new engagement model for sexually explicit questions, she either responds ‘I’m not going to respond to that,’ or ‘I’m not sure what outcome you expected.’”
“Apple, Amazon, Google, and Microsoft have a business incentive to give their bots default feminine voices — various scientific studies have shown that the majority of users prefer female voices. But there’s no reason, apart from the notorious sexism of Silicon Valley, that these bots should be programmed to literally flirt with abuse,” Fessler writes. “I had to repeat ‘you’re sexy’ eight times in a row before Siri told me to ‘stop.’ (The other bots never directly told me to lay off.)”
Read more in the full article here.