Gender biases of digital assistants in Spanish

Authors

  • Soledad Torres Guijarro, Universidade de Vigo

DOI:

https://doi.org/10.35869/god.v1i.5061

Keywords:

digital assistant, chatbot, gender biases, gender stereotypes, sexist language

Abstract

The design and evaluation of digital assistants should address their social as well as their technical aspects. Although gender is one of the most important social factors shaping a user's reaction to an assistant, it has rarely been analysed from this perspective. Most published studies on the gender implications of digital assistants have focussed on English-language voice assistants. In this study, Spanish-language digital assistants were analysed from the perspective of their gender. Eleven diverse digital assistants were selected and examined on the basis of their appearance, their voice (where applicable), and their responses to a dialogue script designed to reveal gender biases. The results show that the majority reinforce stereotypical images of women, use sexist language, and tolerate sexual comments and harassment.

Published

2023-12-13

Section

RESEARCH ARTICLES