This Johns Hopkins University study investigates how the perceived gender of a voice assistant affects user behaviour and the effectiveness of the assistant's error mitigation strategies. It finds that voice assistants that apologize after an error are perceived as warmer than those offering compensation. Male participants favoured apologetic female-voiced assistants over male-voiced ones and interrupted voice assistants more frequently than female participants did, regardless of the assistant's gender. The authors conclude that a voice assistant's perceived gender biases user behaviour, particularly among male users, and suggest that a gender-ambiguous voice could reduce these biases.


Publication date: 19 Oct 2023
Project Page: https://arxiv.org/abs/2310.13074v1
Paper: https://arxiv.org/pdf/2310.13074