Shelly Ronen is a sociologist of gender, sexuality, technology and culture. She is currently a Visiting Assistant Professor at Haverford College. She received her PhD in sociology at New York University in 2018.
How do advertisers working on “pro bono” projects justify working for free? How do they understand the relation between paid and unpaid advertising campaigns? Together with Iddo Tavory and Sonia Prelat (NYU), Shelly is writing a book, under contract with the University of Chicago Press, that explores the alignment and misalignment of various goods and the “moral privilege” they make possible.
Plastic Morality: The Politics of Sex Toys (2014-2018)
How did the formerly taboo sex toy industry become moral and mainstream? What do these material objects reveal about the contemporary state of intimacy and sexual behaviors? How do sex technologies reflect, reinforce or resist gender inequalities in access to sexual pleasure?
The dissertation project involved ethnography and interviews with designers and manufacturers of sex toys.
Superstorm Research Lab (2012-2013)
In the wake of Hurricane Sandy in 2012, Shelly joined with other researchers in forming Superstorm Research Lab (SRL), a mutual-aid research collective working to understand how NGO leaders, activists, volunteers and residents were thinking about social, environmental and economic issues.
Gender, Creativity and Design Work (2012 – 2014)
What does it mean to be a creative professional? How do personal and professional identities inhere in everyday experiences of work? How does the identity category differ for men and women? This project involved interviews with designers at multidisciplinary agencies in the US.
College Courtship and Public Sexuality (2006 – 2009)
How are university and college students experiencing sexual behavior and relationships? What differences are there between the experiences of men and women? Working for Paula England on her large study of college courtship, Shelly conducted interviews with undergraduates about their sex and relationship histories, and ethnography of college socializing.
Design Fiction (2014-)
Outside of NYU, Shelly is part of an artistic collaboration with Ernesto D. Morales. Together they produce speculative design projects for the fictional design consultancy Object Solutions, commenting on visions of an optimized future:
With a dose of dark humor, we dissect everyday challenges and propose technological solutions for each point of imperfection.
The project has been exhibited at several galleries and arts institutions across the US, and Morales and Ronen have led performative workshops in which participants engage in generating specialized inventions for everyday life. See some of the inventions for Love, or buy the self-published Love Optimized volume.
Aviva Rutkin, writing for New Scientist, published a piece last week prodding at the moral panic around sexbots and arguing that there is a substantial disparity between titillating reporting on sexbots and their actual manufacture, sale and future use. I really appreciate that she ends up connecting the viability of home robots with the moral and material valuations of different kinds of labor.
To get there, she takes us on a journey that begins with the dazzling appeal of reporting on sexbots. Not only are sex robots “recession proof” as objects – or so says sexbot maker Douglas Hines – but the headlines sexbots (and their moralistic condemnation) can garner are no doubt providing endless thrills for analytics-obsessed journalists.
Yet notwithstanding the dazzling appeal of reporting on sexbots, sex is not likely to be the only kind of home robot. Care robots are coming onto the market, though their emergence is slow and their designs hover, somewhat embarrassingly, between cuddly and uncanny.
But public opinion is not falling in step. People are still unsure about (or downright uncomfortable with) outsourcing care for the elderly and children to machinery. How interesting that is, or perhaps how absurd, given that this kind of work is among the least respected and lowest-paid there is.
Why granny’s only robot will be a sex robot
Not Like Us is Aviva Rutkin’s monthly column exploring the minds of intelligent machines – and how we live with them
By Aviva Rutkin
Douglas Hines started out with what sounded like a nice idea.
In the early 2000s, the former Bell Labs engineer was busy caring for his elderly father and building his own technology business. That’s when he first came up with the idea for a companion robot: a machine that could look after his dad and keep him in touch with the outside world via webcam.
Hines started working on a prototype, but ran into trouble finding financial and legal support for the project. So he gave up, and instead turned his attentions to Roxxxy, a life-size sexbot dressed in filmy black lingerie (“always turned on and ready to talk or play!”). That gambit was far more successful. As Hines deadpanned in an interview with IEEE Spectrum in 2010, adult entertainment is “recession-proof”.
Hines’s story is a good allegory for the wider landscape of care robots: five years later, sexbots, though not yet exactly flying off shelves, have stoked enough cultural interest to inflame a widely covered campaign to ban them. Meanwhile, care robots for the elderly remain stuck in sociocultural purgatory. They’re the flying skateboards of the service industry: always predicted, always trotted out as an example of the future, perpetually just out of reach. It’s time to admit that the problem with this vision isn’t the technology. It’s us.
On the surface, the fates of sexbots and carebots should not be so divergent. Both are mechanised stand-ins for roles that are typically undervalued and ill-treated in society, with neither ethically straightforward to replace. Neither will work without a robot that can move around on its own and do some heavy lifting. Both would work even better with some level of social or emotional intelligence built in, to better respond to human needs.
Where are all the robots?
It’s especially curious that the carebot revolution has not taken place, in light of how direly we need it to. In the UK, the number of citizens over the age of 65 is expected to surge by 12 per cent by 2020; and the number of over-85s by 18 per cent. Reports have identified care for the elderly as one of the fastest-growing roles in healthcare.
It’s certainly not a lack of robots that’s causing the hold-up. A bevy of recent prototypes includes Toyota Research Labs’ Robear to lift people out of bed, wheelie bot Zenbo, which can call for help in an emergency, and the seal pup Paro, which takes on the emotional labour of fuzzy companionship. In a demo video for Robot-Era, a project recently piloted in Italy and Sweden, “friendly machines” pick up groceries and mail, relay video calls, take out the rubbish, provide reminders about medication, and take their owners’ arms as they stroll down the street.
But how well will these sell? Not very, if you believe surveys. It seems that people don’t like the idea of carebots looking after their vulnerable relatives. Of more than 25,000 people questioned in a 2012 survey of attitudes in the European Union, 60 per cent thought robots that care for children, the elderly and the disabled should be banned outright; and 86 per cent said they would be uncomfortable with one caring for their children or parents (though many more were OK with the idea of a robotic assistant and even a surgeon).
In a separate poll of people in the US, 65 per cent of respondents across all ages agreed that it would be a “change for the worse” if robots became the primary caregivers for the sick and elderly.
Why the squeamishness? We generally look forward to robots doing the chores for us, from answering emails to picking apples to defusing bombs, tirelessly, cheerfully, with uniform precision. (The word “robot”, in fact, is derived from the Czech word for forced labour.) It’s quite all right for a machine to carry out such demands, from the trivial to the tawdry.
On the surface, carebots look like mechanised butlers, too. However, in difficult moments they flip the script – asking us to relinquish control, human connection and our fantasies about ourselves.
Every day, carebots will run into hundreds of small moral dilemmas: their owner decides not to take today’s prescribed medication; she keeps leaving the stove on, or wandering out of the house and down a street heaving with traffic; or he commits a crime in full view of a watchful mechanical eye, as in the film Robot and Frank, in which an ageing thief recruits his carebot as an accomplice.
What mistakes will be acceptable, and which will be grounds for a recall? Will there be limits to a bot’s responsibilities? Or will their charges have to submit to their power?
In the paper “Granny and the Robots”, Amanda Sharkey and Noel Sharkey at the University of Sheffield, UK, point out another drawback to life with a robo-caretaker: it’s lonely. Putting a carebot in place of a human might deprive many of one of their few opportunities for regular social contact. Such isolation is linked to poorer health outcomes, such as a greater risk of developing Alzheimer’s disease or dementia. It could also make people feel plain dehumanised – stripped of their dignity, a vulnerable object to be lifted, fed or prompted at intervals.
“If the human rights of the elderly are to be respected as much as the rights of other members of society, it is important to ensure that robots introduced into elder care do actually benefit the elderly themselves, and are not just designed to reduce the care burden on the rest of society,” write Sharkey and Sharkey.
There’s another reason that carebots might not sit comfortably with us: they don’t jibe with our flattering visions of ourselves. Looking after another human being is hard work. It’s physically and emotionally taxing, occasionally messy, and can be boring and thankless. It’s also among our lowest-paid jobs. There’s an expectation that this work is a kind of calling, performed out of love or a sense of service by a friend or family member, or at least a compassionate and conscientious worker.
The reality is a harsh departure from that ideal. In elderly care homes in the US, people are more likely than in the wider community to be subjected to emotional and physical abuse or neglect – one in 10, according to some reports.
No one looks forward to a carebot dystopia, in which machines exercise dubious moral power over people. But the alternative, too, can be discomfiting: robots turning out to actually be preferable to human aides. It doesn’t reflect too well on us if our future seniors opt to live in a non-human ghetto, with whatever glitches and lack of contact, over the prospect of abuse by bitter and angry staff.
“We need to think of automation as a political question,” said Lucy Suchman at Lancaster University, UK, speaking at a White House workshop on artificial intelligence in New York City on 7 July. “What grounds are there to believe that a robot can engage in the work of care?” Work like this is difficult for a machine to master because of its nature: heterogeneous, open-ended, and often reliant on the ability to interact with others.
Rather than jump to robotic substitutes, we could think of other ways to sate society’s growing need for workers who care for the elderly, such as revaluing the work involved. “The fact that you get paid a huge amount of money to write code and you get paid nothing to take care of people’s children is not a reflection of the relative skills,” said Suchman, “but rather a reflection of the valuation that we make of those jobs within a particular political economy.”
We should ask whether there are really not enough people to do those jobs, or whether it’s just that those roles have been devalued, she added.
The problem closely parallels the idea of using robots for childcare. New parents are expected to extol the joys of parenthood and gloss over the drudgery, even though the experience is a proven drag on personal happiness. Tireless devotion is considered a virtue, one that the vast majority of us cannot attain; leaving a child with just a human nanny carries an undeserved social stigma of neglect, even though for many it’s the only practical solution.
What would the neighbours say if they heard that little Jimmy was left with a machine while mum went out for a well-deserved drink? It may not be fair, but it’s not unimaginable. That’s a tough norm for a shiny new robot to break down.
Leaving a loved one in the care of a machine will look tantamount to admitting that we have other things we’d rather do – that all humans have things they’d rather do. Like, maybe, spend time with our new sexbot.
So while sex robots already have enough of a built-in audience that people are fighting over whether we’ll marry them or ban them, the future for care robots is looking a lot murkier. Unlike with sex robots, we don’t know what we want from them.