We recently wrote a blog post on a TED Talk entitled “This App Knows How You Feel”.
Despite their powerful computing capability, our screens have no way of knowing how we feel. Computer scientist Rana el Kaliouby says that’s about to change. She is on a mission to bring emotional intelligence to our digital devices and is the chief science officer and co-founder of Affectiva, an MIT Media Lab spin-off.
She leads the company’s emotion analytics team, which develops emotion-sensing algorithms and mines the world’s largest emotion database. The team has collected 12 billion emotion data points from 2.9 million face videos submitted by volunteers in 75 countries.
The company’s platform is used by many Fortune Global 100 companies to measure consumer engagement.
The world might seem a little grayer than usual when we’re down in the dumps and we often talk about “feeling blue” — new research suggests that the associations we make between emotion and color go beyond mere metaphor. The results of two studies indicate that feeling sadness may actually change how we perceive color. Specifically, researchers found that participants who were induced to feel sad were less accurate in identifying colors on the blue-yellow axis than those who were led to feel amused or emotionally neutral.
The research is published in Psychological Science, a journal of the Association for Psychological Science.
“Our results show that mood and emotion can affect how we see the world around us,” says psychology researcher Christopher Thorstenson of the University of Rochester, first author on the research. “Our work advances the study of perception by showing that sadness specifically impairs basic visual processes that are involved in perceiving color.”
Previous studies have shown that emotion can influence various visual processes, and some work has even indicated a link between depressed mood and reduced sensitivity to visual contrast. Because contrast sensitivity is a basic visual process involved in color perception, Thorstenson and co-authors Adam Pazda and Andrew Elliot wondered whether there might be a specific link between sadness and our ability to perceive color.
“We were already deeply familiar with how often people use color terms to describe common phenomena, like mood, even when these concepts seem unrelated,” says Thorstenson. “We thought that maybe a reason these metaphors emerge was because there really was a connection between mood and perceiving colors in a different way.”
In one study, the researchers had 127 undergraduate participants watch an emotional film clip and then complete a visual judgment task. The participants were randomly assigned to watch an animated film clip intended to induce sadness or a standup comedy clip intended to induce amusement. The emotional effects of the two clips had been validated in previous studies and the researchers confirmed that they produced the intended emotions for participants in this study.
After watching the video clip, the participants were shown 48 consecutive desaturated color patches and asked to indicate whether each patch was red, yellow, green, or blue.
The results showed that participants who watched the sadness video clip were less accurate in identifying colors than participants who watched the amusing clip, but only for color patches that were on the blue-yellow axis. They showed no difference in accuracy for colors on the red-green axis.
A second study, with 130 undergraduate participants, showed the same effect in comparison to a neutral film clip: participants who watched a sad clip were less accurate in identifying colors on the blue-yellow axis than those who watched a neutral screensaver. The findings suggest that sadness specifically is responsible for the differences in color perception.
The results cannot be explained by differences in participants’ level of effort, attention, or engagement with the task: lapses in effort or attention would be expected to reduce accuracy across both color axes, yet perception was impaired only on the blue-yellow axis.
“We were surprised by how specific the effect was, that color was only impaired along the blue-yellow axis,” says Thorstenson. “We did not predict this specific finding, although it might give us a clue to the reason for the effect in neurotransmitter functioning.”
The researchers note that previous work has specifically linked color perception on the blue-yellow axis with the neurotransmitter dopamine.
Thorstenson points out that this research charts new territory, and that follow-up studies are essential to fully understanding the relationship between emotion and color perception:
“This is new work and we need to take time to determine the robustness and generalizability of this phenomenon before making links to application,” he concludes.
Eyebrows are prominent human features, but what purpose do they serve? Scientists think they help keep stuff out of our eyes and aid in nonverbal communication, among other things. Learn more about eyebrows in this episode.
New research suggests that there is a link between being a good liar and having a full bladder. Yes, you heard that right.
In the paper entitled “The inhibitory spillover effect: Controlling the bladder makes better liars,” Iris Blandon-Gitlin et al. loaded subjects up with different amounts of water and had them lie to interviewers. They found that the more water the subjects were holding in, the more convincingly they lied.
The study, which will be published in the December issue of Consciousness and Cognition, was recently discussed in a Popular Science article:
“In general, it’s much easier to tell the truth than to lie. If you really believe something, your brain just has to recount it. Lying means understanding what to change, making up new versions, controlling anxious behavior, monitoring your listener’s belief, and adjusting if they don’t believe you. “Lying is a very difficult task,” Iris Blandon-Gitlin tells PopSci. She studies the psychology of lying and co-authored the study. “You have to juggle a lot of information.”
Blandon-Gitlin said the strain of a full bladder might help someone lie better by activating the inhibition control centers of the brain. To lie, a brain has to inhibit the urge to tell the truth. If it’s already inhibiting the urge to urinate, it might be easier to lie.”
The study involved subjects who drank either five glasses or five sips of water and then sat around for 45 minutes while their bladders filled. They then had to lie about one of their strongly held opinions on a social issue in front of an interviewer and a camera, as well as give a truthful statement. Later, two groups of observers watched the footage: one group rated the body language and confidence of the subjects, and the other tried to determine which statement was the truth and which was the lie.
The results? “Subjects who had a full bladder told more complex lies more convincingly and comfortably. It was harder for observers to realize the subject was lying; they only correctly identified a lie 30 percent of the time, as opposed to a 70 percent success rate for the truth.
This result adds evidence to something called the inhibitory spillover effect. If you are already using one type of self-control, it’s easier to be self-controlled at other things. However, this only works for simultaneous tasks. Other research has found that resisting the cupcakes at the office can deplete your inhibitory control resources, making it harder to ignore that post-dinner ice cream siren song.
Blandon-Gitlin’s research is aimed at understanding lying. She wants to be able to tell accurately who is lying and who is telling the truth, and to do that, she has to understand the tricks that make liars seem more convincing. “We need to look at potential threats to conventional lie detection measures,” she said.”
What do you think of the findings of this study? Do they resonate with you?
Dr. David Matsumoto, a world authority on interpreting nonverbal communication, inaugurated the second phase of the Specialist Training Course on Evaluating Testimonies and Interviews, organized by the Ministry of Security and Interpol and addressed to members of Latin American police forces.
The objective of the course was to strengthen the skills and abilities of police interview specialists, equipping teams to evaluate, analyze, and interpret statements from victims, witnesses, and defendants by reading nonverbal behavior and performing linguistic analysis of verbalized expressions.
Dr. David Matsumoto will be responsible for training the participants. He was accompanied on this day by the Head of the Regional Bureau of Interpol, Rafael Peña, and the National Director of Regional and International Cooperation of the Ministry of Security, Diego Llumà.
“Dr. Sergio Berni, who is a member of Interpol for the Americas, gave us the support to carry out this project, and today we have the opportunity to host Dr. David Matsumoto, who is one of the world’s foremost authorities on this subject,” said Rafael Peña, adding: “The basic idea is for the police to incorporate special skills for interpreting facial and body gestures and signs, helping to guide the investigator as well as prosecutors and judges.”
The first phase of the course was opened by Security Secretary Sergio Berni in early July of this year. At that time, the training sessions were led by Sergio Rulicki, an anthropologist (UBA) and Doctor of Science in Social Communication (Austral University).
For our Spanish-speaking readers: