The Flaming Lips sing about a robot named Unit 3000-21 that is able to duplicate emotion and seems to be “something more than a machine” to the people it comes into contact with. As it warms, blinks and hums, it begins inspiring a “synthetic kind of love” in its human keepers.
As it turns out, The Flaming Lips’ lyrics may someday apply to reality. For now, at least, humans can feel empathetic emotion for robots, according to research presented this week at the 63rd annual International Communication Association conference in London. While previous studies relied upon participant anecdotes to measure emotional connection to robots, this new research objectively confirmed it through physiological measurements.
Robot torture, disturbingly enough, helped reveal this finding. “It happened by chance that we got into this study,” explains Astrid Rosenthal-von der Putten, a social psychologist at the University of Duisburg-Essen in Germany and lead investigator of the new study. “We saw a video of this robot being punished by some guy online.”
Connecting with Pleo
The bizarre YouTube video, dubbed “Pleo R.I.P.,” depicts two men giddily torturing Ugobe’s Pleo robot, a small green toy that’s built in the shape of a baby dinosaur. As Pleo is smacked, shaken upside down, dropped and stuffed into a bag, it cries, whines and even makes choking sounds. Eventually, it stops working.
Strangely inspired, Rosenthal-von der Putten and her colleagues decided to use the video to test whether or not viewers experienced any emotion while watching the robot getting hammered. They recruited 40 volunteers to watch 10- to 20-second clips of Pleo either being tortured or being treated nicely. They attached skin conductance sensors to participants to measure moisture changes from sweat, which indicates physical arousal and is correlated with emotion. Most participants reported more negative feelings while watching the robot being tortured, and their heightened levels of physiological arousal corroborated this.
In a second experiment, the investigators prepared numerous 10-second video clips depicting either Pleo, a green box or a woman wearing a green T-shirt in a number of situations. In the videos, a man engaged in either friendly or abusive actions towards the woman, the robot or the box. In the abusive scenarios, he strangled them with a yellow rope, slammed them onto a table, or put a plastic bag over them. During these interactions, Pleo made its characteristic distress bleats and the woman screamed or moaned. In the friendly interactions, on the other hand, the man tickled, stroked, caressed, hugged or massaged the woman and the objects. (Rosenthal-von der Putten assures us that no one was harmed in the making of the films, except perhaps Pleo, whose tail was broken at one point.)
Fourteen volunteers watched the video clips while inside a functional magnetic resonance imaging scanner, which recorded their brain activity. During the violent interactions, participants again responded physiologically: activity in their limbic structures indicated strong emotional reactions when either the human or the robot, but not the box, was being tortured. In a hat tip to humanity, the woman’s agony inspired stronger emotional reactions than Pleo’s distress did.
Does robo-dino empathy translate to androids?
Pleo is a small green dinosaur, not a humanoid. Whether people would react more strongly to watching a human-like robot being harmed remains an open question that Rosenthal-von der Putten hopes to investigate if appropriate funding comes through (human-shaped robots for smacking around cost around 16,000 euros, compared to Pleo’s 300 euros). The team is also interested in discovering whether other forms of abuse, such as verbal insults and social ostracism, will also make people feel empathy for robots.
Whether these results point to the possibility of a long-lasting human-robot bond also remains to be seen. “This is just an initial step to look into this process,” Rosenthal-von der Putten says. “There are still other high cognitive processes involved that might or might not lead to relationship building.”
As a start, though, the results do hint at two possibilities for the future. For one, if people share the same emotional and neurological reactions when encountering both other humans and robots, then we should probably think carefully about how we deploy robots in the future. On the other hand, this emotional trigger could be exploited to facilitate more effective interactions between people and robots, such as strengthening bonds with helper robots that live with elderly individuals in their homes.
“The problem is that people are initially interested in the robot but then it gets boring,” Rosenthal-von der Putten explains. “It’s important to know how people react to these kinds of technologies just to find a way to make them more interesting in the long term and keep people engaged.”