The problem with digital assistants like Alexa is not sexism ... it's that they are not real human beings
Women today face enough problems in the world without picking a fight with AI robots, writes Wendy Grace
Apparently, the latest thing that women need protecting from is gendered technologies. According to a recent report from the UN agency Unesco, digital assistants - like Amazon's Alexa and Apple's Siri - are entrenching harmful gender biases and encouraging sexual harassment.
Female victimhood is now so pervasive that artificial intelligence might impact our ongoing fight for equality. And men are such pigs, with an insatiable desire to sexually harass women, that they will even do it, if permitted, to a robot with a female persona.
The report's title, I'd Blush If I Could, comes from a response Siri once gave when told "Hey Siri, you're a bitch" - a reply the authors say is indicative of the "apologetic, or lacklustre" responses these ''female'' assistants give when insults are hurled at them.
Except you can select a male voice in these devices and, recently, Siri has been updated to respond to such comments with "I don't know how to respond to that".
The report also highlights that, when Amazon's Alexa was called a "slut", the voice assistant replied "thanks for the feedback".
Again, Alexa has had an upgrade, which will now default, when met with sexually explicit, or abusive, comments, to "I'm not going to respond to that".
The other big player, Google, now has six voices that are not assigned a name, but a colour and, last March, the first gender-neutral AI assistant, Q, was launched.
One problem, according to the report, is that AI engineering teams are made up overwhelmingly of men. It's their nasty, unconscious bias that is the problem. I'm all for encouraging and empowering more women to participate in the STEM workforce, but the idea that this would solve sexism might be a little naive.
Women might like to convince themselves that diversity will lead to change, but that hasn't happened. Take women's magazines: filled mostly with advertisements featuring suggestive images of women in submissive and sexualised poses, they are often edited and put together by women, and mostly read by them.
You might ask why a committee of the UN has been spending its time looking at sexist robots. I suspect the same UN group would have an issue if Alexa were Alex, because then endless wisdom would be coming from men, not women.
It is frustrating to think of resources and expertise being spent in this way when you consider the many issues affecting women in UN countries today - places where women don't have full access to education, are not allowed to drive, or where gender-based violence is a huge issue. Sexism should matter first and foremost when we are talking about real women.
The report states that around 5% of interactions with AI are sexually suggestive, a statistic which I find rather creepy and pathetic. Ultimately, someone who gets their kicks from asking an algorithm to talk dirty to them has plenty of avenues to indulge such desires and the suggestion that we can change strange behaviour by adjusting computer code in these machines is absurd.
It is nothing new that technology can be used to indulge our darker side. If someone wants to use it for this end, you don't need to look very far to see that the pre-programmed responses of a plastic, metal and fabric cylinder are the least of our worries.
The report asserts that these devices are ''teaching gender stereotypes''. Perhaps we don't want to address the real problem, that we are replacing people with products, we are learning how to interface with the world through technology rather than real relationships.
So are millennials doomed to learn sexism from these artificial assistants? Before we could even comprehend this type of technology, the world had already been smothered with gender stereotypes in arguably much more powerful mediums, from music, to movies, to video games. It's probably a war we are never going to win.
So, perhaps, we should be fighting other battles: rather than pondering whether you can sexually harass code, we should be analysing the root causes of why sexual harassment in the real world is happening in the first place.
One scary prediction is that, within the next five years, people will speak to a voice assistant more than to their partners - in other words, we will interact with robots more than with people.
The real story here is the degradation of human relationships and interactions, the ones that help us strive to recognise the innate value of each person.
The connections that bind us and the relationships that influence us are being eroded and what can stop this is not an automated voice, but real human beings focused on relating to one another rather than a machine.
A justified fear is for our children, who will have this type of technology so entrenched in their lives that it will have a powerful influence.
This certainly will be true if we don't aim to change the reality that we are often subcontracting parenting to technology and this issue has the potential to be much more damaging than any of the assertions made in this report.
Last week, it was reported that only one-third of parents read a bedtime story to their children, with a quarter of parents asking devices like Alexa to read their children a bedtime story. When you include YouTube and other apps, more than half of parents are substituting devices for themselves.
I hope to be a good role model to my son - primarily he will learn from his mum and dad. I will teach him myself that calling a woman (or even a female-sounding object) a "slut" is always wrong.
I would rather that lesson come from me than from an automated response and my choice will be to avoid these devices altogether.
One recommendation in the report is that these devices should be designed to discourage ''gender-based abuse''. So, not only are we delegating parenting, but we are expecting a robot - rather than parents, friends and society at large - to teach basic human decency.
Of course, if we continue to let technology erode real-life relationships, it will have a profound sociological effect on how we all interact with one another. No amount of UN reports and recommendations will change this trajectory.
There will always be influences that will try to compete with my aspiration of authentic equality and respect between the sexes. But I would rather spend my time and energy in the real world - deciding what media I consume and focusing on my relationships and interactions with my family, friends and peers - than on trying to wage war against these apparently misogynistic machines.