Subtle changes in blood flow color around the face reveal the mood we’re in, even before our faces move to form the expression we want, according to a groundbreaking study by The Ohio State University.
It is the first study to document a connection between blood flow color changes and facial expression in the absence of facial movement. The researchers found that people can correctly identify another person's expression up to 75 percent of the time based only on shifts in blood flow color around areas of the face, including the cheeks, chin, eyebrows, and nose.
The paper is published in the Proceedings of the National Academy of Sciences.
A New Hypothesis
To begin the study, the researchers took hundreds of photos of human expressions, including happiness, sadness, anger, disgust, and surprise, and examined the anatomy behind facial movement.
Upon observing these expressions, the researchers noted a large number of blood vessels located between the skin and facial muscles. This led to the hypothesis that before the face even moves to form an expression, humans visually communicate emotion through subtle changes in blood flow color.
The researchers further developed this hypothesis by asking two questions:
- Are there color changes that are unique to each emotional expression?
- Are these color changes visible to other humans?
Testing the First Question
To test the first question, the researchers separated hundreds of facial expression images into two color channels, a red-green channel and a blue-yellow channel. Computer analysis then showed that each facial expression formed a unique color pattern.
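The article doesn't specify the color model the team used, but red-green and blue-yellow opponent channels map naturally onto the a* and b* axes of the CIELAB color space. The sketch below, written in Python with scikit-image, illustrates that kind of decomposition under that assumption; the facial region masks are hypothetical placeholders, not part of the study.

```python
# A minimal sketch of the decomposition described above. It assumes the
# red-green and blue-yellow channels correspond to the a* and b* axes of
# CIELAB; the study's actual color model may differ.
from skimage import io, color

def opponent_channels(image_path):
    """Return the red-green (a*) and blue-yellow (b*) channels of an image."""
    rgb = io.imread(image_path)   # H x W x 3 array of RGB values
    lab = color.rgb2lab(rgb)      # convert to CIELAB (L*, a*, b*)
    red_green = lab[..., 1]       # a*: negative = green, positive = red
    blue_yellow = lab[..., 2]     # b*: negative = blue, positive = yellow
    return red_green, blue_yellow

def region_pattern(channel, masks):
    """Summarize a channel as its mean value inside each facial region.
    `masks` maps region names (e.g. 'cheeks', 'nose') to boolean arrays;
    the regions themselves are hypothetical placeholders."""
    return {name: float(channel[mask].mean()) for name, mask in masks.items()}
```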
“There’s a little bit of every color everywhere,” Aleix Martinez, cognitive scientist and professor of electrical and computer engineering at OSU, said in a statement.
Touches of red, green, blue, and yellow appeared in every emotion, the researchers found, but in different amounts and locations around the face. Disgust, for example, creates a blue-yellow cast around the lips and a red-green cast around the nose and forehead, while happiness is usually conveyed with red cheeks and temples and a little blue around the chin.
“Previous research has studied the hypothesis that facial muscle movements transmit emotion,” said Martinez. “In the present work, we study the novel hypothesis that blood flow changes, observed as variations in facial color, also transmit emotion. We show results in favor of this hypothesis. We also show that these two signals (facial muscle movements and color) are at least partially independent from one another.”
Testing the Second Question
After finding strong results for the first part of the hypothesis, the researchers set out to test whether these color changes alone are enough for people to identify a distinct facial expression.
To test this question, the researchers superimposed different color channels on pictures of faces with neutral expressions. They then showed these photos to 20 participants and asked them to identify the expression each person was conveying, by choosing from a list of 18 emotions. The emotions included both basic expressions like “happy” or “angry,” as well as compound expressions such as “happily surprised” or “sadly angry.”
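As a rough illustration of this colorization step, one could shift a neutral face's opponent channels by per-region offsets and then convert back to RGB. The offsets, regions, and function names below are invented for the example; they do not reproduce the study's measured patterns.

```python
# A hedged sketch of superimposing an emotion's color pattern onto a
# neutral face: shift the opponent channels region by region, convert
# back to RGB. Offsets and masks here are made up for illustration.
import numpy as np
from skimage import color

def colorize_neutral(rgb_neutral, offsets, masks):
    """rgb_neutral: H x W x 3 uint8 image of a neutral face.
    offsets: {region: (delta_a, delta_b)} opponent-channel shifts.
    masks:   {region: boolean H x W array} facial regions."""
    lab = color.rgb2lab(rgb_neutral)
    for region, (delta_a, delta_b) in offsets.items():
        mask = masks[region]
        lab[..., 1][mask] += delta_a   # red-green shift
        lab[..., 2][mask] += delta_b   # blue-yellow shift
    rgb = color.lab2rgb(lab)           # back to RGB, floats in [0, 1]
    return (np.clip(rgb, 0, 1) * 255).astype(np.uint8)

# For instance, a crude "happy" pattern might redden the cheeks:
# happy = colorize_neutral(img, {"cheeks": (8.0, 0.0)}, masks)
```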
The participants guessed right most of the time. For the neutral photos colorized to look happy, participants identified the expression correctly 70 percent of the time. They identified sadness correctly 75 percent of the time and anger correctly 65 percent of the time.
This showed that participants could identify a person's emotion without ever seeing the face move.
In a second experiment, the researchers showed participants facial expressions of happiness, sadness, and other emotions with mismatched color patterns superimposed on them. For example, in some photos the researchers superimposed happy colors on an image of a person whose facial expression indicated sadness, and vice versa. In this experiment, participants noticed that something about the images looked wrong, even if they couldn't quite identify why.
“Participants could clearly identify which images had the congruent versus the incongruent colors,” Martinez said in a statement.
Computers vs. Humans
In addition to testing this theory on human participants, the researchers ran the colorized neutral-face images through emotion-identifying computer algorithms of their own design.
The computer scored a bit higher than humans, detecting happiness correctly 90 percent of the time and the compound emotion “happily surprised” correctly 85 percent of the time. In addition, the computer detected anger correctly 80 percent of the time, sadness 75 percent of the time, fear 70 percent of the time, and “fearfully disgusted” 65 percent of the time.
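The paper's actual algorithms aren't described in this article, so the following is only a stand-in: it classifies expressions from per-region opponent-color means using an off-the-shelf scikit-learn classifier, reusing the hypothetical region masks from the earlier sketches.

```python
# A minimal stand-in for an emotion classifier built on opponent-color
# features; this is not the study's method, just one plausible baseline.
import numpy as np
from sklearn.linear_model import LogisticRegression

def color_features(red_green, blue_yellow, masks):
    """Concatenate per-region means of both opponent channels into
    one feature vector for a single face image."""
    return np.array([channel[mask].mean()
                     for channel in (red_green, blue_yellow)
                     for mask in masks.values()])

# With X as a matrix of such feature vectors and y as expression labels:
# clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
# print(clf.score(X_test, y_test))   # fraction classified correctly
```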
Real World Applications
The researchers are patenting their computer algorithms and hope the work will inform research in artificial intelligence (AI), computer science, cognition, and neuroscience.
They have also formed a spin-off company, Online Emotion, to commercialize the research.
Martinez believes that many AI applications can be developed if computers are able to consistently detect human emotions.
“There are many AI applications, e.g.: monitoring patients for emotional distress in a clinic or hospital, monitoring pre-language children or people that cannot communicate verbally, robots that can recognize our emotions as other humans do, apps that can edit your photos to make people look happier, sadder, etc., smart video games, and so on,” he said.