Gender violence is present in many areas of society, and many women suffer from it every day. Unfortunately, not even virtual assistants manage to escape it.
It turns out that assistants like Siri, Alexa, Cortana and Google Assistant tend to respond leniently to requests for sexual favors, sometimes even gratefully. Because all of them have female voices, these responses reinforce a harmful gender stereotype, and they show how far technology still has to go to overcome it.
That's what Beatriz Caetano, a UX writer for chatbots, says in an interview: “Artificial intelligence tools strengthen the idea that women are helpful, docile, servile and tolerant of any treatment.”
The subject is so important that it was the topic of a panel at The Developer’s Conference (TDC) Innovation, held in Florianópolis, Santa Catarina, and has prompted alerts from UNESCO about the changes this scenario demands.
According to research the organization published in 2019, nearly half of people's conversations with bots focused on physical appearance, and this kind of conversation happens mostly with female bots. The survey also showed that 18% of these interactions involved sexual content.
“Presenting a bot as a woman can give people, especially men, the opportunity to break sexual taboos and boundaries without worrying about the consequences,” said Beatriz, explaining people's preference for bots with a feminine voice or appearance.
Most of the large companies that produce this type of technology already ship it with a feminine persona by default, and often don't even allow users to switch the gender.
But, according to Beatriz, this type of technology is still under development, and its development teams should introduce codes of conduct to curb this kind of violence in the technological environment, in order to change behavior in the real world as well.
This was one of the measures Bradesco bank took after noticing the high rate of harassment its virtual assistant, Bia, suffered over the course of a year. The bank created a campaign against sexual harassment and programmed the assistant to give firm responses in these cases.
“We can do this in many ways. First, by avoiding gendering or personifying bots. It is also important to diversify teams, including women in the development of these products; to promote ethical discussions in companies; and to encourage support tools for identifying gender bias in chatbot design,” explained Beatriz.