Savannah

Hey Siri, are you sexist?

In 2019, UNESCO published a report titled ‘I’d blush if I could’. The title refers to the response that Apple’s popular AI assistant, Siri, gave before April 2019 to the phrase ‘hey Siri, you’re a bitch’, revealing issues of sexism and gender stereotyping that prevail in the world of AI virtual assistants.



With most digital assistants embodying a feminine voice, such as Amazon’s Alexa, Apple’s Siri and Microsoft’s Cortana, some argue this reinforces stereotypical notions of women as servile beings who exist only to serve others. Moreover, with women in the UK carrying out an average of 60% more unpaid domestic work than men, it doesn’t seem to be helping overcome gendered stereotypes or the gap in domestic labour.


The UNESCO report also points to the considerable amount of verbal and sexual abuse directed at voice assistants. A writer for Microsoft’s Cortana noted that a large proportion ‘of early-on enquiries’ probe the assistant’s sex life, and Robin Labs, a company that develops digital assistants, found that over 5% of interactions with AIs are sexually explicit. The worst part is the unsettling, docile responses the assistants are programmed to give, such as Siri’s previous reply to ‘who’s your daddy?’ being ‘you are’. Richardson, the author of An Anthropology of Robots and AI, argues that this ‘reflects what some men think about women – that they’re not fully human’. Others suggest the programming mirrors how girls are taught to deal with sexual harassment in school: ‘by brushing it off’. Together with the recent UN Women UK survey revealing that 97% of women aged 18-24 have experienced sexual harassment, the crisis of sexual harassment (and its normalisation) seems to be penetrating all areas of life, including virtual ones.


The question remains why virtual assistants have been programmed to be submissive and servile, with feminine voices. According to the UNESCO report, it stems from a lack of diversity within the industry, with women making up only 12% of AI researchers and 6% of software developers. Furthermore, these companies’ heads of technology, whose products are used by billions of people, are all male: the Chief Technology Officer at Amazon is male; the Director of Solutions and Technology at Google is male; and the Director of Information Systems and Technology at Apple is male.


Having said this, some, such as Karl MacDorman, an associate professor at Indiana University’s School of Informatics and Computing, claim it is because research indicates a greater acceptance of female speech. Additionally, when assistants respond pleasantly and politely, even in cases of harassment, it maximises a user’s desire to keep engaging with their device. Although this may help companies increase revenue, it is worrying on an ethical level.


Overall, the use of feminine voices and passive responses to sexual harassment in virtual assistants reinforces and legitimises gender stereotypes and sexual harassment. The evidence for why this is points to the homogeneity of the technology industry and to society’s patriarchal view of women in general.



