The smart assistants sold by big tech companies have a sexism problem. Their names and voices are default female, and — especially in the early days — they answered provocative or abusive queries with cheeky responses like “I’d blush if I could.”
Companies have addressed these criticisms (including from the U.N.) in some ways: You can make Siri’s voice male, and Alexa now responds to sexually suggestive or harassing queries with answers like “I’m not going to respond to that”; previously, Alexa said “Thank you.”
It’s not enough.
In a new book published this September by the MIT Press called The Smart Wife: Why Siri, Alexa, and Other Smart Home Devices Need a Feminist Reboot, two researchers chart the ways that AI, robots, and other digital devices have been assigned — and play — the roles that were typically the realm of the 1950s housewife (or at least what people imagined her to be). Meet the Smart Wife: she handles housework, cares for family members, and provides companionship (and even sometimes sex).
The authors posit that the existence of the “smart wife” is a consequence of myriad societal phenomena, all happening at once: A shortage of women who want to be that picture perfect housewife, a society that demands multiple incomes per household, a devaluing of domestic work and the unique value that a human being brings to it, the greed and marketing savvy of tech companies, and more.
That’s why viewing these devices as “smart wives” — rather than masking the way they are gendered behind gender-neutral specs — is a clearer way to see this technology for what it is: a perpetuation of male-dominated society, not merely a set of consumer devices.
We sat down with the authors of The Smart Wife, Jenny Kennedy and Yolande Strengers, to learn more about the role that the smart wives play in the world, and how we can reboot them for the better. You’ll never talk to your smart wife in the same way again.
This conversation has been edited for length and clarity.
Mashable: What is a “smart wife”?
YS: A smart wife is a form of robotic, AI, or digital device that is taking on the traditional wifely labors in the home. We mean tasks like housekeeping, homemaking, emotional labor and care, and also intimate care including sex. We look at a range of smart wives in the book starting from digital voice assistants all the way to sex robots, and we look at the similarities between these devices and traditional forms of wives in different societies, particularly the 1950s housewife, and how these devices are being brought in to do some of those traditional wifely roles in the home.
JK: We also considered under the smart wife any kind of representation of these kinds of devices in popular culture, like sci-fi movies and TV shows that feature some kind of feminized AI or robotic device [such as the Stepford Wives or even Austin Powers’ fembots].
YS: The design and inspiration for the smart wife in its physical form has actually come from fictional stories and popular culture. [Editor’s note: The authors write in their book “Alexa’s inspiration came from the computer voice featured onboard the Starship Enterprise in the 1960–1990s’ Star Trek franchise. In fact, her company’s founder and CEO, Jeff Bezos, is also a Trekkie—another link between popular science fiction and the smart wives of Silicon Valley.”]
We have a very narrow form of femininity on show here, being programmed into a wide range of devices, and that’s what we’re primarily concerned with.
Mashable: What is it about a machine performing activities that aren’t inherently gendered — but have become so in our cultural imagination — that makes those actions by the machine gendered?
YS: The very things that we’ve given to robots and digital assistants and AI to do are the traditional feminized labors of the home. That in itself is an interesting set of decisions about what these devices are being brought in to do. There’s an assumption there, too, that this kind of labor — particularly emotional and caring labor — can be outsourced to a device, and to a machine. The fact that we say, oh, well we could give that over to a robot, or we could give that over to a machine, instead of looking at whose role it is to do that, which genders predominantly do that, and how maybe we should be addressing labor dynamics in the home and beyond, the answer it seems is the smart wife. The question that we raise is, should it be?
The other big one is their form. The curviness of robotics, and the cylindrical kind of design, and in some cases the very overt gendering and sexualization of devices, is extremely feminine in a very stereotypical sense. It does occupy a range. There’s everything from the curves of the device to a sexualized kind of sex robot or gynoid device that’s feminized.
JK: Another example that is opposite to that is a Roomba. The device itself doesn’t have a gendered name or a voice. And yet the inspiration for it is Rosie, a female, kind of 1950s character [Editor’s note: Rosie the housekeeping robot from The Jetsons was also an inspiration for Roomba’s creator]. In the popular imagination, there is a connection between these very ambiguously non-gendered devices, and who performs that kind of work in the home. And they’re constantly being reconnected.
Mashable: What is the effect of connecting digital assistants with the imagined work of “wives”?
JK: The underpinning issue is that it reinforces the way in which domestic and reproductive labors are currently valued or devalued. It doesn’t contribute to a reimagining of how important those labors are, and how necessary they are, and how valued they should be.
YS: There are so many effects. Some relate to how we treat our devices. When they have these friendly, feminine, serving personalities, users perceive them as more open to abuse. The problem is not just that we abuse the device, but that we’re abusing a sort of feminine form, which can then reinforce broader sexism and abuse in society, from casual insults right through to how these devices get enrolled in domestic violence situations. [Editor’s note: The authors discuss in their book the role of technology, including smart assistants, in domestic violence, including studies that show how “mobile technologies provide opportunities for perpetrators to create a sense of ‘omnipresence’ to isolate, control, stalk, humiliate, punish, and abuse women in domestic violence situations.”]
There’s a whole basket of effects.
Mashable: Would making these devices gender neutral make them less sexist?
JK: No, because what that ends up doing is detracting from the way that there are sexism issues in AI and in the AI industry. If we just make everything gender neutral, then what we’re not doing is addressing everything that underpins the work they do, the way we as a society value that kind of work, the way in which different professions and different work is stratified.
YS: We’re not against the idea of gender neutral voices. But it’s not really possible to make these devices gender neutral. Making their voice gender neutral doesn’t actually make them gender neutral. What we see happening is people put a gender neutral voice into a device, or provide gender neutral voice options, and then they say, oh OK, problem solved. And we just don’t think that’s good enough.
Mashable: What specifically would making these devices “gender neutral” paper over?
JK: It’s overall trying to divert responsibility for essential domestic and caring responsibilities onto smart wives, rather than consider as a society as a whole, how we’re valuing and how we’re addressing that work and that need.
YS: In some cases they’re certainly useful. We recognize that there are some benefits of having these devices. But whether they’re the solution to aged care, for example, or parenting children, or doing housework, is a huge question. And somehow we’ve just jumped to smart wives as the answer, because it seems like an easy solution. But in all cases, it’s not the best solution. In some cases, we do need humans. We can’t simply assume that these labors can and should be outsourced to robotic and automated technologies.
Mashable: Whether overtly or covertly, does keeping AI linked with the “wife” serve a purpose?
YS: The gender stereotyping of these devices serves a purpose. The companies that are doing this use the feminine form because it’s something that people are familiar with, it’s non-threatening, it’s more friendly, we find it comforting. So we’re more likely to accept devices that have a feminine personality into our lives and into our homes. And that gets us using them. But it also overlooks the effects that these devices have in the world, effects that extend far beyond what we see in our immediate lives and environment.
Research has shown how devices are damaging to the environment, and particularly to marginalized people who are mining the minerals, or producing them in manufacturing centers, or being somehow exposed to the e-waste on the other end. One of the problems with these devices’ feminine forms is they kind of mask those effects for us. They make them seem like they’re doing good in the world. They leave us numb to these effects, and certainly protect us from seeing the true effects of these devices.
Mashable: Would you say the feminine approachability is the Trojan horse for the way we come to accept these devices?
JK: Absolutely. These corporations want these devices in our homes, they want us to welcome them in. And so making them as appealing and as friendly and as innocuous as possible is a major factor in that.
Mashable: How can the tech industry and consumers make AI less sexist?
JK: Basically, how do we change patriarchy? This is a whole-of-society problem; it’s not just the people working in the tech industry, or the people purchasing the devices. It’s the broader society and culture that they sit within as well. There isn’t one simple fix.
In our book we give nine proposals as part of our Smart Wife “reboot” manifesta, which is a starting point for thinking about what needs to change, but with some quite specific recommendations.
One suggestion is thinking about how we’re engaging with the technology, how we’re representing the technology. What about programming a device to push back against users who decide to call them a bitch? It’s all about the microaggressions, and the way they feed into larger social problems. And so one of the ways we can do that is to think about whether these devices push back against those behaviors, or whether they just keep allowing the same old behaviors.
There are already some great examples of this happening, where feminist roboticists and designers are thinking about how to design these devices with dignity and respect, and with the capacity to shut down if abused, or let people know when they’re behaving inappropriately.
Other proposals include looking at ways to break the masculine association between “boys and their toys” in the smart home industry and in the way it is presumed who is interested in these technologies. And of course we also look at how the industry can be transformed, not just by encouraging more women in technology, but also by accelerating the inclusion of other social science disciplines in the design and programming of smart wives, which are not just advanced technologies but also ways of intervening in societies.
YS: Another suggestion we make is to focus on what we already have, which is smart wives, and try to broaden out and diversify what they do, how they behave, how they respond, what kind of personalities they have, so we don’t just have this one fairly uniform form of femininity being reproduced en masse across the entire world. Because that’s really problematic. In the book, we talk about this in terms of queering the smart wife. Look at it this way, we wouldn’t expect to only engage with one type of person or personality in our lives, so why should we expect that of our digital voice assistants, AI and robots?
We need some new ideas, some new imagination, some new innovation. Another one of our proposals is to look to sci fi as a source of inspiration for smart wives. There’s already a clear link between the smart wives we see on screen and those we have in our homes. But both follow fairly typical plot lines and often end up occupying service-oriented or 1950s-esque roles, with a variety of glitches or personality flaws blamed on these artificial women. Because there’s such a clear link between what we see on screen, and what we see in popular culture, and what we now actually see in our homes, we need to be seeing some variation. Inspired by the Geena Davis Institute “If she can see it, she can be it” campaign, which is about improving the representation of real-life women and girls on screen, we propose something similar for the smart wife. The idea being that if we can see more diverse plot lines and personalities for these digital women on screen, then that will also inspire new design directions for the varieties we have in our homes.
Some of our other proposals are more practical, like changing the way we talk about smart wives in the media. Instead of headlines that blame feminized AI for their glitches, we suggest talking about these devices’ problems in relation to the people who are making them. So a headline like “Siri needs to wash her mouth out with soap and water” (referring to a racist response Siri was giving at one time) gets turned around to something like “Apple’s algorithms result in Siri making racial slurs.”