
Start Date

24-6-2022 12:00 AM

End Date

24-6-2022 12:00 AM

Abstract

At the same moment that technology companies are being called to account for the discriminatory impacts of their technologies as part of larger “ethical AI” conversations, so too is “Big Tech” being called to address (or redress) problematic gender stereotyping in their digital assistant technologies. To date, these interventions by industry leaders (i.e. Amazon, Google, and Apple) have been somewhat limited and primarily focused on the inclusion of “male voice options” for some of their major voice interfaces (e.g. Google’s assistant; Siri). Outside of these normative interventions, activists, researchers, advocacy groups, and technologists are exploring feminist AI projects (e.g. F’xa) as well as genderless design for voice assistants and other related digital assistant technologies (e.g. Project Q and Pegg).

In this paper, I employ critical cultural frameworks informed by feminist, queer, and critical race theories to explore these multiple approaches to what I describe as refacing gender in voice interfaces. Refacing gestures to the attempt to repair, or renew, gender identity in the interface, with each of these projects employing different sets of gender logics as a point of departure. Refacing as a framework offers a starting place for considering how these various models of gender repair and renewal do political work in re-articulating gender-as-interface in the context of the broader political and cultural climate.

Bio

Miriam E. Sweeney is an associate professor of Library and Information Studies at the University of Alabama who focuses on critical digital media and information studies. Her research explores the anthropomorphic design of virtual assistants, voice interfaces, and AI through the lenses of race, gender, and sexuality, as well as issues of dataveillance and power in big data assemblages.


Refacing Gender in Digital Assistant Technologies