Can We Learn Empathy from Robots?

Many people familiar with science fiction harbor an ingrained fear of, and repulsion toward, what they see as cold and unfeeling robots.

The idea of widespread artificial intelligence brings to mind terrifying visions from films such as The Terminator or The Matrix, both of which present an apocalyptic future where artificial intelligence turns on mankind with disastrous results. The basic concern seems to be that robots lack any sense of empathy towards their human creators.

However, many humans already struggle with empathy, and this problem is especially acute in the field of medicine. Unfortunately, many patients find it hard to effectively communicate their pain to doctors, the very people who are able to treat it. Granted, pain is a difficult thing to communicate, but there is some evidence that doctors are even worse at recognizing it than the general population.

This may be born out of necessity, as medical professionals are required to distance themselves emotionally from patients in order to conduct treatments in a scientific and objective fashion. That said, it creates problems in trying to understand and diagnose pain conditions.

Dr. Laurel Riek, a professor of computer science at the University of California, San Diego, sought to test whether doctors could properly recognize emotional expressions in their patients. When medical experts and laypeople were shown digitally simulated facial expressions, the clinicians proved to be much less accurate at recognizing pain.

While the study analyzed various emotions, including anger and disgust, recognition of pain represented the starkest disparity between the groups. Only 54 percent of medical professionals successfully identified pain as opposed to an 83 percent success rate for laypeople.

The experiment simulated facial expressions not with images of actual humans, but with computer-generated imagery and a physical robot. The robot was created by analyzing a vast video archive of human expressions and using face-tracking software to graft those expressions onto its uncannily realistic rubber face. The robot is named Philip K. Dick.
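
Dr. Riek's actual pipeline is not public, so purely as a rough illustration, here is a minimal Python sketch of what the face-tracking step might look like, using the OpenCV and dlib libraries to pull facial landmark coordinates out of each video frame. The video filename and the idea of mapping mouth landmarks to servo targets on the robot are illustrative assumptions, not details from the study.

```python
# Sketch: extract facial landmarks from an archive video, frame by frame.
# Landmark coordinates are the kind of signal that could drive the motors
# behind a robot's rubber face. Requires opencv-python and dlib, plus
# dlib's standard pre-trained 68-point landmark model.
import cv2
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def landmarks_from_video(path):
    """Yield a list of (x, y) landmark points for every frame with a face."""
    capture = cv2.VideoCapture(path)
    while True:
        ok, frame = capture.read()
        if not ok:
            break  # end of the video file
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for face in detector(gray):
            shape = predictor(gray, face)
            yield [(shape.part(i).x, shape.part(i).y) for i in range(68)]
    capture.release()

# Hypothetical usage: mouth landmarks (points 48-67 in dlib's 68-point
# scheme) could be translated into actuator positions on the robot's face.
for points in landmarks_from_video("expression_archive.mp4"):
    mouth = points[48:68]
```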

Now, Dr. Riek is trying to use robots like Philip K. Dick to teach doctors how to better understand emotion. There is some precedent for this, as clinicians have often used robots as practice dummies for learning medicine.

But she has pointed out a major flaw in the use of these robotic training tools: “These robots can bleed, breathe, and react to medication… They are incredible, but there is a major design flaw – their face.” She explains that facial expressions are critical in communicating pain to doctors, not just in interacting with the patient but also in quickly diagnosing strokes or adverse reactions to medication.

This entire enterprise may strike many readers as highly ironic, given the cold, calculated image that science fiction has given us for artificial intelligence. Even the robot’s namesake was a prolific writer who dealt with the problem of robots’ lack of empathy. However, Dr. Riek’s work demonstrates the many varied applications such a powerful technology can have in helping us better understand emotions and facial expressions.

For more research on empathy and facial recognition, check out our past blogs here and here.

Empathy and Facial Expressions

Do you think you’re good at reading other people’s facial expressions? You might be surprised!

While facial expressions provide key insight into the emotions of other individuals, empathy may be an even more important tool for understanding the emotions of fellow humans.

In a new study, Dr. Haotian Zhou from Shanghai Tech University and Professor Nicholas Epley from the University of Chicago asked one group of participants, let’s call them Group A, to review a series of emotionally charged photographs. Some photographs displayed depressing images, while others were cheerful or idyllic.

These participants were then asked to write down their emotional reactions to each photograph, and their faces were recorded on video. Then, additional groups of participants were brought in to review these materials and attempt to properly identify Group A’s emotional reactions.

These subsequent participants were divided into three categories. The first practiced “theorization,” seeking to determine Group A’s emotional reactions through facial observation alone. They watched the video footage but were not told what the Group A participants were viewing.

This “theorization” cohort was contrasted with a “simulation” group, who tried to identify emotional reactions from the photographs alone, without exposure to the recorded expressions. This required them to empathize with how other individuals would feel in that situation.

Interestingly, the “simulation” group proved more effective at identifying Group A’s reactions. In fact, the advantage of the simulation approach was so striking that additional participants with access to both the video footage and the photographs were no more accurate than those who examined the photographs alone.

While it may seem obvious to some of you that empathy would be a powerful tool in understanding other people’s emotions, most participants seemed to underestimate this potential. When given a choice between approaches, only a minority of participants selected the “simulation” method.

The study also examined whether it would be advantageous for participants to compare images of their own facial expressions with those of Group A. Hypothetically, this could have allowed them to better infer Group A’s emotions from the video footage by matching the recorded expressions against their own. That said, those participants were no more accurate than the other groups.

While this study underscores the power that empathy can have in promoting interpersonal understanding, we do not always have the ability to simulate other people’s experiences, as real-world situations vary wildly beyond simple reactions to photographs.

It is at those times that reading microexpressions becomes critical, but this study shows how difficult that can be without proper training. While the participants in this study were untrained, it would be revealing to see how Humintell staff, or those trained by Humintell, would have performed.

For more information on developing this skill, click here.

Reading Those Puppy Dog Eyes

While we have often discussed how universal emotional expressions are, emerging research is expanding this universality even beyond our own species!

A 2017 study from the University of Helsinki sought to better understand how humans recognize emotions and facial expressions in dogs. The study found not only that humans can effectively read canine expressions, but that many needed to rely only on basic human empathy to do so.

While it seems intuitive that humans with long-term experience living with dogs can learn to read their facial expressions, this study went further, finding that previous experience with dogs was only a secondary factor.

Instead, the general ability to empathize proved to be an effective method for understanding canine facial expressions. That said, participants with previous experience with dogs were better able to understand other aspects of body language, such as posture or tail movements.

This research built on previous work exploring our capacity to read canine expressions. In a 2013 study, researchers at Walden University in Florida showed human participants images of a dog displaying various emotions, including happiness, fear, sadness, anger, and disgust. Long-term followers of this blog might notice a telling overlap with the seven basic emotions.

While participants often had trouble identifying sadness and disgust, almost half were able to recognize fear in the dog’s face. Surprisingly, 88 percent properly identified happiness, including those with little previous experience with dogs.

This study helped establish our ability to read canine emotions, and the more recent study from the University of Helsinki demonstrated that this ability is rooted in facial recognition, not unlike our ability to recognize emotions in fellow humans.

Perhaps more surprisingly, it isn’t just humans who can read dog emotions. Additional research has found that dogs are quite good at reading ours!

For example, a 2016 study out of the University of Lincoln exposed dogs to a series of images displaying human facial expressions. The researchers juxtaposed these images with audio clips of humans expressing similar emotions vocally. Sometimes they matched the audio and visual cues to present the same emotion, while at other times they exposed the dog to conflicting emotions.

Their research found that dogs showed a marked increase in attentiveness and interest when the audio and visual cues displayed the same emotion. This suggested that dogs can recognize human emotions from both our facial expressions and our voices.

Concurrent research, again at the University of Helsinki, came to a similar conclusion. A 2016 study tracked dogs’ eyes as they viewed human faces, finding that they focused primarily on our eyes and responded quickly to expressions of anger.

These lines of inquiry help bridge the gap between human and animal emotions. They do more than illuminate interspecies interactions: by comparing forms of facial and emotional recognition, we can better understand the nuances of our own, human capacities.

For more information on animal emotions, see our past posts here and here.
