When your boss strolls up to your desk at 5 p.m. on a Friday and asks you to work on Saturday, your facial expression tells the whole story. And, according to a new study from researchers at Ohio State University, whether your boss hails from Nigeria, Nepal or Nebraska, the look on your face will still come across loud and universally clear.
How Many Ways to Say No?
The study, led by Aleix Martinez, a professor of electrical and computer engineering at OSU, looked at the facial expressions of 158 students with a range of native languages as they expressed “I don’t want to.”
Speakers of English, Spanish, Mandarin Chinese and American Sign Language (ASL) were filmed while reciting a sentence with a negative valence, or responding to a question that they were likely to disagree with. The researchers manually selected the telltale signs of what they called the “Not Face” — furrowed brows, raised chin and compressed lips — from the images and set a computer algorithm to work sorting out “Not Faces” from others. They published their results Monday in the journal Cognition.
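The study's actual pipeline is more involved, but the core idea — flagging frames where all three facial movements co-occur — can be sketched in a few lines. This is a minimal illustration, not the researchers' code: it assumes each video frame has already been reduced to facial action unit (AU) intensities on a 0–1 scale, and the AU numbers and threshold below are assumptions based on the standard FACS codes for the movements the article names (furrowed brow ≈ AU4, raised chin ≈ AU17, pressed lips ≈ AU24).

```python
def is_not_face(aus: dict[str, float], threshold: float = 0.5) -> bool:
    """Flag a frame as a candidate 'Not Face' when all three
    component movements are active at once (hypothetical rule)."""
    return all(aus.get(au, 0.0) >= threshold for au in ("AU4", "AU17", "AU24"))

# Hypothetical per-frame intensity values for illustration
frames = [
    {"AU4": 0.8, "AU17": 0.7, "AU24": 0.9},  # all three markers active
    {"AU4": 0.9, "AU17": 0.1, "AU24": 0.2},  # furrowed brow alone
]
print([is_not_face(f) for f in frames])  # [True, False]
```

Running such a rule over every frame of a recording is what lets the researchers compare how often the expression appears against the rate of spoken syllables.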
The Universally Understood ‘Not Face’
They found that the “Not Faces” appeared with the same frequency as spoken syllables, indicating that it was a genuine mode of communication, as opposed to a random occurrence. What’s more, the expression translated almost perfectly across languages, implying that the genesis of this particular expression extends far back into the past. While our words may differentiate us, our expressions remain a global unifier.
Martinez has done research into facial expressions before. In a 2014 study, he categorized 21 unique emotions, including "happily disgusted" and "sadly angry," for use in cognitive analysis. The new research builds on his previous findings by definitively linking a facial expression to language. While most of us recognize nonverbal modifiers with ease, proving that one of these modifiers exists across cultures and languages will allow for more accurate facial recognition software, as well as insights into the beginnings of communication and language.
Words and sentences make up only a part of human communication — as anyone who has ever managed to get directions in a foreign country using only hand gestures can attest. These arm-flailing conversations may look ridiculous, but they nevertheless succeed in getting the basic concept across. Even in normal conversation, our faces and bodies convey subtle shades of nuance that can add up to distinctly alter the meaning of a sentence.
Crucial for Sign Language
In certain languages, the unspoken cues hold much more significance. Sign language, for example, is based on hand and body movements, but also relies heavily on a diverse array of facial expressions. For proof, look no further than ASL translator Lydia Callis, who became an Internet sensation during Hurricane Sandy for her virtuosic use of facial expressions while signing about the impending storm.
In his study, Martinez found that ASL users also deploy the “Not Face,” but do so to even greater effect than verbal language users. While those speaking English, Spanish and Chinese used the expression to strengthen the stated emotion, ASL users would replace the sign for “not” entirely, using only the “Not Face” to convey the same statement.
Martinez says that this is the first documented instance of ASL signers completely replacing a word with a facial expression. Such a discovery highlights the crucial role facial expressions play in fully communicating how we feel to others.
Martinez hopes to expand his library of faces by teaching computer algorithms to recognize different expressions without the need for manual selection. Once they have that ability, he plans to use thousands of hours of YouTube videos to train them and hopefully compile a database of human expressions.
Such a database of expressions might be of interest to robots like Sophia, whose accurate but still creepy facial expressions made headlines at this year's SXSW.