It is a fact: for years we have dragged a gender gap across all areas. From Hollywood to leadership roles in big companies, it seems it will take many years to close this gap. One hundred years, to be precise, according to the World Economic Forum. But 2017 and 2018 have been enlightening years, and we have begun to become more aware of the situation and of how women have been set aside and had their participation in health, education, economics and politics underestimated.

But what is happening in the new areas?

We are talking about Artificial Intelligence (AI). The industry is creating a new generation of robots and AI machines shaped by patriarchal stereotypes, especially virtual assistants, sexbots and autonomous weapon systems.

“The machines of tomorrow are likely to be either misogynistic, violent or servile,” says Samir Saran, Vice-President at ORF.

Servile

In 2017, The Guardian reported that the sex tech industry is estimated to be worth $30 billion.

Female sex robots are able to hold a short conversation, be obedient (they are designed never to say no), move their head and lips and even simulate an orgasm. And yes, you read that right: “female” sex robots, because even though the sex robot industry has been on the market for years, male sex robots have not been manufactured yet. One company promises to do so this year, but demand is not as high as it is for a female sex “companion”.

Although these sex tech toys are sold as “an alternative to accompaniment”, they also open the door to the objectification of women and to the belief that women must satisfy all of men’s needs. They also open the gate to unimaginably perverse demands, such as child sex robots.

Some reports say that these sex toys are used not specifically for sex but to counter loneliness. The problem is that users can carry those interactions over to real people.

Sexist

Demand for virtual assistants is expected to increase over the next decade, and although assistants like Siri, Cortana and Alexa let the user choose between a female and a male voice, that does not change the overall picture of male dominance and female servitude in AI.

If a conversation with a virtual assistant includes sexist and insulting remarks, the assistant reflects regressive and patriarchal ideas: it responds with acceptance, complacency, or an apology for not understanding what was said.

When the same questions were put to real women, they found them insulting and disrespectful.

Moreover, these virtual assistants tend to filter the way they respond in order to connect more closely with the user.

Veronica Belmont, writing for Chatbots Magazine, summarises the issue well: “Gendering artificial intelligence makes it easier for us to relate to them, but has the unfortunate consequence of reinforcing gender stereotypes.”

This is due to the nature of an AI’s existence, which needs to be accommodating and non-confrontational.

The paradox is that the female voice is preferred in virtual assistants, but when users interact with AI on issues related to finance or decision making, a male voice is almost always used.

The way that AI learns

The problem is what we are teaching the machines. To create a personality for an AI system, developers feed the software images, text, voice and other data drawn from society. If you google “domestic cleaning”, roughly 90% of the images will show a woman cleaning a house. That information goes into the machine, which builds a model of the world that is nothing more than a reflection of society.

Over time, these systems assimilate behaviour patterns. If a robot is designed to work in a recruitment agency and has to select a person to clean a house, its system will choose a woman: not because of her skills, but because the data supplied to the software suggests that women are more suited to that kind of job. A toy sketch of this is shown below.
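To make the point concrete, here is a minimal, purely illustrative sketch (the data, names and 90/10 split are made up, not taken from any real recruitment system) of how a model that simply learns from skewed hiring records ends up preferring a woman for a cleaning job regardless of skill:

```python
# Illustrative sketch with made-up data: a "model" that only learns from
# skewed hiring history reproduces that skew when ranking candidates.
from collections import Counter

# Hypothetical past records: who was hired for domestic cleaning jobs.
past_hires = ["female"] * 90 + ["male"] * 10

# The "training" step: learn how often each gender was hired historically.
hire_share = Counter(past_hires)
scores = {gender: count / len(past_hires) for gender, count in hire_share.items()}
print(scores)  # {'female': 0.9, 'male': 0.1}

# Two equally skilled candidates; the system still picks by the learned skew.
candidates = [
    {"name": "Candidate A", "gender": "male", "skill": 0.95},
    {"name": "Candidate B", "gender": "female", "skill": 0.95},
]
chosen = max(candidates, key=lambda c: scores.get(c["gender"], 0.0))
print(chosen["name"])  # Candidate B: chosen for the pattern in the data, not for skill
```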

Two prominent research-image collections—including one supported by Microsoft and Facebook—display a predictable gender bias in their depiction of activities such as cooking and sports. Images of shopping and washing are linked to women, for example, while coaching and shooting are tied to men.

Machine-learning software trained on the datasets didn’t just mirror those biases, it amplified them. If a photo set generally associated women with cooking, software trained by studying those photos and their labels created an even stronger association.
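As a rough sketch of that amplification mechanism (with made-up numbers, not figures from the cited datasets): a model that always predicts the most common gender for an activity turns a moderate skew in the training labels into an absolute skew in its output.

```python
# Rough sketch of bias amplification with made-up numbers: a majority-class
# predictor turns a 66/34 skew in the training labels into a 100/0 skew
# in its predictions.
from collections import Counter, defaultdict

# Hypothetical training labels: (activity, gender of the person pictured).
train = [("cooking", "woman")] * 66 + [("cooking", "man")] * 34

label_share = Counter(gender for _, gender in train)
print({g: n / len(train) for g, n in label_share.items()})
# {'woman': 0.66, 'man': 0.34}  <- the bias already present in the data

# "Train" a predictor that outputs the gender most often seen with an activity.
counts_by_activity = defaultdict(Counter)
for activity, gender in train:
    counts_by_activity[activity][gender] += 1

def predict(activity):
    return counts_by_activity[activity].most_common(1)[0][0]

# At prediction time the skew is no longer 66/34 but 100/0.
predictions = Counter(predict("cooking") for _ in range(100))
print({g: n / 100 for g, n in predictions.items()})
# {'woman': 1.0}  <- the model amplifies the bias rather than just mirroring it
```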

The other issue is that there are still not enough women involved in the creation of AI who could help steer the way these machines learn.

Perhaps it will take 100 years to close the gender gap, but we can start by recognising the kinds of behaviour and patterns that we have repeated constantly since time immemorial.