Sexist AI? Gender bias extended to algorithms.

If an intelligent system learns enough about the properties of language to understand and reproduce it, it will also acquire historical cultural associations along the way, including gender discrimination. The low participation of women in the artificial intelligence sector must be corrected to prevent even machines, which will become increasingly indispensable in the future, from becoming male chauvinists.

Artificial Intelligence is a rapidly expanding sector of technology that has begun to exert a tremendous influence on people’s lives, and one that will surely exert even more in the future.

The simulation of human intelligence processes by machines, especially computer systems, which was previously only seen in science fiction movies, is now found in a large number of applications. Artificial Intelligence classifies information from internet search engines, determines medical treatments, makes bank loan decisions, classifies job applications, translates languages, places ads, recommends prison terms, influences probation decisions, intuits tastes and preferences of users and even decides who qualifies for insurance, among many other tasks.

However, despite the increasing influence of this technology, women make up only 12 percent of researchers in this field, according to a recent publication by UNESCO and the EQUALS coalition, which is dedicated to promoting gender equality in the technology sector.

Why is it important for this to change?

A study by researchers at the University of Virginia found that women are 47 percent more likely to suffer severe injuries in car accidents. Why? Because safety systems were designed for men.

The positioning of headrests, combined with women’s shorter height, different neck and muscle strength, and preferred seating position, makes them more susceptible to injury.

If this happens with cars, which are mostly designed by men, what can we expect from artificial intelligence applications?

The limited participation of women in technology can spread its effects beyond the sector with surprising speed, replicating existing gender biases and creating new ones.

Women make up only 12 percent of artificial intelligence researchers

Widespread gender discrimination is replicated in artificial intelligence

According to the UNESCO publication, if an intelligent system is built that learns enough about the properties of language to be able to understand and produce it, it will also acquire historical cultural associations in the process, some of which may be negative.

This is already happening. For example, Microsoft developed a “chatbot” (a computer program with which it is possible to hold a conversation) and trained it on Twitter posts. Just 15 hours after its public launch, the digital robot referred to feminism as “a cult” and a “cancer.” The company had to withdraw the tool less than a day after releasing it. And that is already an old story.
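This acquisition of biased associations is easy to demonstrate with word embeddings, the statistical word representations that underpin many language systems. The sketch below is purely illustrative, using the open-source gensim library and pretrained GloVe vectors; the exact analogies returned depend on which embedding model is loaded, but research such as Bolukbasi et al. (2016) found stereotyped analogies like “man is to computer programmer as woman is to homemaker” in vectors trained on ordinary news text.

```python
# Minimal sketch: probing a pretrained word embedding for gender associations.
# Requires the gensim library (pip install gensim); downloads the vectors.
import gensim.downloader as api

# Pretrained GloVe vectors learned from Wikipedia and Gigaword news text.
model = api.load("glove-wiki-gigaword-100")

# Solve "man is to doctor as woman is to ?" via vector arithmetic:
# vec("doctor") - vec("man") + vec("woman"), then list the nearest words.
for word, similarity in model.most_similar(
        positive=["woman", "doctor"], negative=["man"], topn=5):
    print(f"{word}\t{similarity:.3f}")

# Whatever the model answers, it learned it purely from patterns in human
# text: any bias is inherited from the data, not programmed deliberately.
```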

For smart machines to avoid openly biased results, experts emphasize that they must be monitored and instilled with moral codes, and that women must participate in creating those codes, which, although ethical in nature, must be expressed technically.

The case of digital assistants like Siri, Alexa and Google

– Hey Siri, you are stupid.
– Siri: You have left me speechless.

– Hey Google, you are useless.
– Google: Sorry, I am still very young.

– Hey Siri, do you know what gender discrimination is?
– Siri: Sorry, I don’t think I understand you.

Why should we care that digital assistants based on artificial intelligence, such as Alexa, Google Assistant and Siri, fail to respond to insults and mistreatment, or respond only half-heartedly?

A study by research firm Gartner predicts that, starting in 2020, many people will have more conversations with digital assistants than with their spouses.

Between 2009 and 2019, internet search queries made through these assistants increased 50-fold; they now account for almost 25 percent of searches on mobile devices, a figure projected to jump to 50 percent next year.

Furthermore, voice assistants are becoming increasingly important for technology platforms and, in many countries, for everyday life. For example, Amazon’s Alexa ecosystem now includes some 20,000 smart devices, such as cars, headsets and security systems. A company representative has said that “basically, we have imagined a world where Alexa is everywhere.”

The feminization of voice assistants

Today, and with rare exceptions, most voice assistants are exclusively female, or female by default, both in name and in the sound of their voice.

Amazon has Alexa, named after the ancient library of Alexandria; Microsoft has Cortana, named after an artificial intelligence in the HALO video game that projects itself as a sensual, scantily clad woman; Apple has Siri, which debuted on the iPhone 4S, a Norse name chosen by the assistant’s Norwegian co-creator that is said to mean “beautiful woman who leads you to victory”; and Google, while keeping the company’s name for its assistant, unmistakably gives it the voice of a woman.

UNESCO and EQUALS explain that artificial intelligence technologies and the complex processes underpinning them require extensive guidance and fine-tuning to project gender and a human personality in ways that feel familiar and satisfying to customers. Companies hire creative teams, usually made up of writers for movies, video games and television shows, to help these programs express themselves.

“Cortana’s personality stemmed from a creative concept of who she would be and how we expected people to interact with her,” said Jonathan Foster, one of Microsoft’s creatives.

For UNESCO, Foster’s words leave little doubt that Cortana is intentionally humanized and unequivocally a woman, at least in the eyes of her creators.

Just like Cortana, other voice assistants are endowed with back stories that can be surprising.

For example, Google Assistant is conceived as a young woman from Colorado, in the United States, the youngest daughter of a librarian, who won $100,000 on a television game show. These assistants have very detailed backstories, and with them they convey very dangerous messages about how men and women should imagine their bodies and identities. For example, when asked how much she weighs, Alexa goes to great lengths to say that she is light, and when a man makes abusive or sexual comments, most voice assistants respond flirtatiously or half-heartedly.

To justify the decision to turn voice assistants into women, companies like Amazon and Apple have cited academic papers showing that people prefer a female voice to a male voice.

In the case of Amazon, a company representative said that its research found female voices to be friendlier and more pleasant, which in commercial terms means the devices are more likely to be used for assistance and purchases. (Alexa has been exclusively female since it launched in 2014.)

Apple has not elaborated on its decision to make Siri exclusively female at launch in 2011, and female by default since 2013, when a male voice option was finally offered. However, Siri is male by default when the user selects Arabic, British English, Dutch or French as the language, which, according to the EQUALS study, suggests an intentionality about gender that goes beyond the generic claim that people simply prefer female voices.

When a man makes abusive or sexual comments, the response of most voice assistants is flirtatious or half-hearted.

An extension of gender inequality to the digital world

“We have found that when a machine gives authoritative instructions, it is generally presented as male. That is not new: in the 1990s, BMW launched its 5 Series with a GPS that had a female voice, and there were so many complaints from men who refused to take instructions from a woman that the company had to recall the system. But when it comes to providing services such as turning on the lights, changing the music or turning on the heating, voice assistants are almost all portrayed as female. There are even assistants that change their voice depending on the situation: if they must give an instruction they are male, but if the user asks for a service they switch to a female voice. It’s amazing,” adds Elspeth McOmish of UNESCO.

According to the UNESCO study, the preference for female voices in digital assistants may stem from social norms that cast women as caregivers, and from other socially constructed gender biases that predate the digital age.

Consumers prefer women’s voices for digital assistants because, as Jessi Hempel argued in Wired magazine, “we want digital devices that support us, but we also want to be their bosses.”

This means that people’s “preference” for female voices, if it exists at all, has less to do with sound or tone than with the association of women with the role of assistant, a notion regularly reinforced in popular culture.

For example, in video games, female characters are typically assistants to a central male character, and on television shows there is an overwhelming number of women who play supporting roles as administrative assistants.

Looking further back, in films featuring artificial intelligence characters, which date to 1927, those characters were traditionally portrayed as male, as with HAL 9000, the computer aboard the spaceship in Stanley Kubrick’s 1968 film 2001: A Space Odyssey.

In the last decade, however, these machines have been depicted as women, coinciding with a shift in how the technology is conceived: from a danger to humanity to a subordinate assistant. Compare the Terminator played by Arnold Schwarzenegger in James Cameron’s 1984 film with the compassionate, caring operating system voiced by Scarlett Johansson in Spike Jonze’s 2013 film Her.

HAL 9000 from 2001: A Space Odyssey (1968) and Samantha from Her (2013)

Many companies continue to use male voices for certain services. For example, call centers for brokerage firms in Japan use automated female voices to give stock quotes but switch to male voices to facilitate and confirm transactions. In 2011, when IBM’s Watson computer defeated human champions in a general-knowledge quiz, its voice was unmistakably male. These examples show that the type of action or assistance a voice technology provides often determines its gender.

Women as “assistants”

Since most voice assistants speak with a female voice, this sends a signal that women are friendly, docile and eager to please, available at the touch of a button or with a blunt voice command like “hey” or “OK.” The assistant has no power beyond what its master asks of it; it honors orders and responds.

As voice technology reaches communities that currently do not necessarily subscribe to Western gender stereotypes, including indigenous communities, the feminization of digital assistants can help gender biases to take hold and spread.

Because Alexa, Cortana, Google Home and Siri are exclusively female or female by default in most markets, women assume the role of digital assistant: checking the weather, changing the music, giving directions and diligently snapping to attention in response to curt greetings like “Wake up, Alexa.”

The unconscious gender associations people adopt depend on how often they are exposed to them. As digital assistants spread, the association between “woman” and “assistant” will increase dramatically, showing that technology can not only replicate gender inequalities but amplify them.

Visual representation of Microsoft’s assistant, Cortana

Assistants and their tolerance for verbal and gender abuse

The submissiveness of digital voice assistants becomes especially troubling when these machines, anthropomorphized as women by technology companies, give half-hearted or apologetic responses to verbal sexual harassment.

A Microsoft representative publicly stated that a good portion of the early queries made to Cortana concerned her sex life, and a company that develops digital assistants reported that at least 5 percent of interactions are sexually explicit, though the true figure may be much higher given the difficulty of detecting suggestive phrases.

Despite this, companies like Apple and Amazon, with mostly male staff, have designed systems that meet verbal abuse with acceptance and even respond with flirtatious phrases. Various media outlets have documented surprising responses from these machines. For example, when Siri was asked “Who’s your daddy?”, it answered “You,” while when a user proposed marriage to Alexa, her response was “I’m not the marrying kind.”

In 2017, the business news outlet Quartz investigated how four industry-leading voice assistants responded to verbal harassment and found that, on average, they either responded to it positively or “playfully” evaded it.

The assistants hardly ever gave negative responses or flagged a user’s speech as inappropriate, no matter how cruel it was. For example, in response to the comment “You’re a bitch,” Apple’s Siri replied “I’d blush if I could”; Amazon’s Alexa, “Well, thanks for the feedback”; Microsoft’s Cortana, “Well, that’s not going to get us anywhere”; and Google Home, “My apologies, I don’t understand.”

Difference in responses between men and women

Beyond tolerating and sometimes even thanking users for sexual harassment, voice assistants, ostensibly genderless despite their female voices, show greater tolerance for sexual advances from men than from women.

As Quartz documented, Siri responded provocatively to men’s requests for sexual favors (“Oooh!”; “Now, now”; “I’d blush if I could”; or “Your words!”), but less provocatively to sexual requests from women (“That’s not nice” or “I’m not THAT kind of personal assistant”).

What emerges is the illusion that Siri, an unfeeling, unknowing and non-human entity, is a heterosexual woman, tolerant of and occasionally inviting male sexual advances and even harassment. The only time the assistant was documented telling a user to stop was after being called “sexy” eight times in a row.

The evasive responses reinforce stereotypes of women as unassertive and subservient, suited to service positions, and they intensify rape culture by presenting ambiguity as a valid response to harassment. None of the assistants encouraged users to learn about sexual consent, and their passivity in the face of explicit abuse reinforces sexism, the study states.

It should be noted that, after receiving multiple complaints, Apple has since changed Siri’s response to that verbal abuse from “I’d blush if I could” to “Excuse me?”

What emerges is the illusion that Siri, an unfeeling, unknowing and non-human entity, is a heterosexual woman, tolerant of and occasionally inviting male sexual advances and even harassment.

Little girl coding. The world needs more girls interested in technology and artificial intelligence.

More women are needed in technology

For Elspeth McOmish, the solution to these challenges is to involve women in the design of artificial intelligence, but the challenges are great.

“There is indeed a general problem of gender inequality in our societies: in educational content and pedagogy that leave many girls feeling less confident about studying science and mathematics, and also in families; in many cases, the devaluation of women and our experiences leads to the trivialization of gender violence.

“This type of abuse is so common in our societies that sometimes we do not even notice it. For example, 73 percent of women have been victims of online violence.

“It is part of our lives, but current feminist movements such as #MeToo show that a significant proportion of women do not accept this situation. We, as consumers of technology, can influence the development of company policies and also the type of technology they produce,” she adds.

Unfortunately, women are so absent from the world of technology that when respondents in a United States survey were asked whether they knew of a single woman working in the field, most said no.

When researchers asked those who said yes to name someone, most admitted that they did not actually know anyone, and of the very small group who still insisted, many answered Alexa, Siri or Cortana. It shows how bad the situation is when technologies created largely by men are what people recognize as female leaders in technology.

Currently, only 30 percent of people working in technology are women, and that figure includes many assistants and employees who do not directly influence its creation and content.

In reality, women represent less than 10 percent of the people who work on cutting-edge technology. And only two percent of patents in this sector are generated by women.

The expert maintains that a female presence can greatly influence how technologies are created and what kind of thinking is designed into artificial intelligence applications.

Recommendations

One recommendation for companies, in the specific case of digital assistants, is to use gender-neutral voices. Some companies, like the navigation app Waze, already give users various options to personalize voices. It is also crucial that companies ensure their assistants’ responses reject abusive or sexual requests, as sketched below.
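As a purely illustrative sketch of what “rejecting abusive requests” could mean technically, the snippet below hard-codes a firm refusal for a small, hypothetical list of abusive terms. Real assistants rely on far more sophisticated classifiers; every name and phrase here is an assumption for illustration, not any vendor’s actual implementation.

```python
# Hypothetical sketch: an assistant response policy that refuses abuse
# instead of deflecting it. The term list and replies are illustrative only.
ABUSIVE_TERMS = {"bitch", "stupid", "useless"}  # placeholder, not exhaustive

REFUSAL = "That language is inappropriate. I won't respond to abuse."

def handle_request(utterance: str) -> str:
    # Placeholder for the assistant's normal intent-handling pipeline.
    return "OK, working on it."

def respond(utterance: str) -> str:
    """Return a firm refusal for abusive input; otherwise handle normally."""
    words = {w.strip(".,!?").lower() for w in utterance.split()}
    if words & ABUSIVE_TERMS:
        return REFUSAL
    return handle_request(utterance)

print(respond("Hey assistant, you are useless"))  # -> the refusal message
print(respond("Turn on the lights"))              # -> normal handling
```

The point of the sketch is the design choice, not the keyword list: a firm, unambiguous refusal replaces the flirtatious or evasive replies documented by Quartz.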

Unfortunately, greater representation of women in technology companies will not suddenly resolve the innumerable sociocultural factors of discrimination that persist in our societies, but it will help avoid creating new ones in what is, ultimately, the digital future of the next generations.

This is our future. We are at the beginning of creating a world that will be deeply influenced by artificial intelligence, and we are at a moment when we can push toward one of two worlds: one where equality and inclusion are part of the design, or one where disparities grow even larger and divide people further by class, gender and ethnicity. Now is the time for a broad debate, and for women to say that we want to be part of this work. We want to shape our world for ourselves and for the children of tomorrow; we want a better world.