A few more bits on how automation gets gendered in particular kinds of contexts and settings. In particular, there has been growing discussion of how ‘home’, and certain sorts of intimacy, get identified with particular kinds of domestic or caring work that is then gendered.
Two PhD researchers I am lucky enough to be working with, Paula Crutchlow (Exeter) and Kate Byron (Bristol), have approached some of these issues from different directions. Paula has had to wrangle with this in a number of ways in relation to the Museum of Contemporary Commodities but it was most visible in the shape of Mikayla, the hacked ‘My Friend Cayla Doll’. Kate is doing some deep dives on the sorts of assumptions that are embedded into the doing of AI/machine learning through the practices of designing, programming and so on. They are not, of course, alone. Excellent work by folks like Kate Crawford, Kate Devlin and Gina Neff (below) inform all of our conversations and work.
Here’s a collection of things that may provoke thought… I welcome any further suggestions or comments 🙂
Alexa is female. Why? As children and adults enthusiastically shout instructions, questions and demands at Alexa, what messages are being reinforced? Professor Neff wonders if this is how we would secretly like to treat women: ‘We are inadvertently reproducing stereotypical behaviour that we wouldn’t want to see,’ she says.
– Prof Gina Neff in conversation with Ruth Abrahams, OII
It has been reported that female-sounding assistive chatbots regularly receive sexually charged messages. It was recently cited that five percent of all interactions with Robin Labs, whose bot platform helps commercial drivers with routes and logistics, are sexually explicit. The fact that the earliest female chatbots were designed to respond to these suggestions deferentially or with sass was problematic as it normalised sexual harassment.
– Vidisha Mishra and Madhulika Srikumar, Predatory Data: Gender Bias in Artificial Intelligence
“Consistently representing digital assistants as female…hard-codes a connection between a woman’s voice and subservience.”
– Jessica Nordell, Stop Giving Digital Assistants Female Voices, The New Republic