Alexa, Google, Siri: What are Your Pronouns? Gender and Anthropomorphism in the Design and Perception of Conversational Assistants

Abstract: Technology companies have produced varied responses to concerns about the effects of the design of their conversational AI systems. Some have claimed that their voice assistants are in fact not gendered or human-like, despite design features suggesting the contrary. We compare these claims to user perceptions by analysing the pronouns they use when referring to AI assistants. We also examine systems' responses and the extent to which they generate output which is gendered and anthropomorphic. We find that, while some companies appear to be addressing the ethical concerns raised, in some cases, their claims do not seem to hold true. In particular, our results show that system outputs are ambiguous as to the humanness of the systems, and that users tend to personify and gender them as a result.

Anthology ID: 2021.gebnlp-1.4
Volume: Proceedings of the 3rd Workshop on Gender Bias in Natural Language Processing
Month: August
Year: 2021
Address: Online
Venue: GeBNLP
Publisher: Association for Computational Linguistics
Pages: 24–33
DOI: 10.18653/v1/2021.gebnlp-1.4
Bibkey: abercrombie-etal-2021-alexa

Cite (ACL): Gavin Abercrombie, Amanda Cercas Curry, Mugdha Pandya, and Verena Rieser. 2021. Alexa, Google, Siri: What are Your Pronouns? Gender and Anthropomorphism in the Design and Perception of Conversational Assistants. In Proceedings of the 3rd Workshop on Gender Bias in Natural Language Processing, pages 24–33, Online. Association for Computational Linguistics.

Cite (Informal): Alexa, Google, Siri: What are Your Pronouns? Gender and Anthropomorphism in the Design and Perception of Conversational Assistants (Abercrombie et al., GeBNLP 2021)