“I’d blush if I could.” That was Siri’s response to insults from users, until April of this year. Hardly a response with which Siri discourages users from calling her names (I refer to Siri as if she is a woman, since she has a female voice and is treated as such by users). As of April 2019, Apple changed Siri’s response to “I don’t know how to respond to that”, which is better but unfortunately still shows no disapproval of the users’ behaviour.
“I’d blush if I could” is also the title of a report published by UNESCO. This organisation conducted research into the behaviour of virtual assistants, and in particular the behaviour of virtual assistants with female voices. UNESCO found that, besides Siri, many other virtual assistants are projected as young women with submissive traits, and that these submissive traits help preserve gender imbalances.
Of course, it is not wrong to picture a virtual assistant as a woman when a female voice helps build a helpful, supportive, trustworthy assistant, as Microsoft did with Cortana. However, many companies apparently choose a female voice for these kinds of reasons and subsequently programme their virtual assistant as a submissive one. That is the wrong way to go, because when a virtual assistant responds timidly to an insult, it further normalises sexism in daily life.
But why are those female virtual assistants programmed with submissive traits? UNESCO indicated that women are very scarce in the teams that develop such technological tools. In Silicon Valley, where many of these artificial intelligence products are developed, the vast majority of employees are male. UNESCO’s researchers argue that because so few women work on these technologies, sexism is perpetuated in the resulting devices. Another striking finding is that the use of female voices in technological devices makes it seem as if women are responsible for any mistakes the devices make, while these technologies are mainly programmed by men.
In order to reduce sexism in technological devices, the researchers formulated several recommendations. First of all, virtual assistants should not be programmed with a female voice by default. Think of Q, the world’s first genderless voice. Secondly, virtual assistants should be programmed in such a way that they firmly rebuff insults. Moreover, more women and girls should be encouraged to develop their technological skills, so that there is more female input in the development of new technologies.
The question is whether the big technology companies will listen to such an advisory report written by UNESCO researchers. I personally think that governments should take responsibility as well, for example by introducing regulations and by encouraging girls and young women to develop their technological skills in school. Nevertheless, it is not only in the hands of companies and governments; our own mentality must change too.