Flashback Friday: How to tell if someone is really in pain or just faking it.

By Seriously Science | March 24, 2017 6:00 am
Photo: flickr/Sarah-Rose

When it comes to reading people, scientific studies have revealed helpful strategies for situations ranging from playing poker to identifying gonorrhea-infected people by smell alone. But this study might prove even more useful. Here, researchers show that it is possible to distinguish between people who are faking pain and those who are actually experiencing it. And although people can be trained to improve their ability to tell the two apart, they have nothing on computer vision: apparently, when it comes to pain, computers are better than humans at telling forced facial expressions from involuntary ones. Are we one step closer to a Torture Bot? Only time will tell…

Automatic Decoding of Facial Movements Reveals Deceptive Pain Expressions.

“In highly social species such as humans, faces have evolved to convey rich information for social interaction, including expressions of emotions and pain. Two motor pathways control facial movement: a subcortical extrapyramidal motor system drives spontaneous facial expressions of felt emotions, and a cortical pyramidal motor system controls voluntary facial expressions. The pyramidal system enables humans to simulate facial expressions of emotions not actually experienced. Their simulation is so successful that they can deceive most observers. However, machine vision may be able to distinguish deceptive facial signals from genuine facial signals by identifying the subtle differences between pyramidally and extrapyramidally driven movements. Here, we show that human observers could not discriminate real expressions of pain from faked expressions of pain better than chance, and after training human observers, we improved accuracy to a modest 55%. However, a computer vision system that automatically measures facial movements and performs pattern recognition on those movements attained 85% accuracy. The machine system’s superiority is attributable to its ability to differentiate the dynamics of genuine expressions from faked expressions. Thus, by revealing the dynamics of facial action through machine vision systems, our approach has the potential to elucidate behavioral fingerprints of neural control systems involved in emotional signaling.”
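The abstract's key claim is that the machine's edge comes from the temporal dynamics of facial movements, which human observers largely miss. As a rough illustration of that idea only, here is a minimal, hypothetical Python sketch: it simulates facial action-unit intensity traces whose burst timing differs between "genuine" and "faked" clips (an assumption for illustration, not the paper's data), summarizes each trace with a few dynamic features, and trains an off-the-shelf SVM. The simulation, feature choices, and classifier are all placeholders; the study's actual system measured facial movements automatically from real video.

```python
# Hypothetical sketch: classifying genuine vs. faked pain from the *dynamics*
# of a facial action unit (AU) time series. All numbers and feature names
# here are illustrative, not taken from the paper.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def simulate_au_series(faked, n_frames=300):
    """Simulate an eye-closure AU intensity trace for one video clip.
    Faked expressions are modeled as fewer, longer, more regular bursts;
    genuine ones as shorter, more variable bursts (an assumption loosely
    inspired by the idea that voluntary and involuntary timing differ)."""
    t = np.arange(n_frames)
    burst_rate, burst_len, jitter = (0.02, 25, 0.1) if faked else (0.05, 10, 0.5)
    signal = np.zeros(n_frames)
    onsets = t[rng.random(n_frames) < burst_rate]
    for onset in onsets:
        length = int(rng.normal(burst_len, burst_len * jitter))
        signal[onset:onset + max(length, 1)] += rng.uniform(0.5, 1.0)
    return np.clip(signal + rng.normal(0, 0.05, n_frames), 0, None)

def dynamic_features(series):
    """Summarize temporal dynamics: fraction of time active, variability,
    and how abruptly intensity changes from frame to frame."""
    diffs = np.abs(np.diff(series))
    active = series > 0.3
    return np.array([active.mean(), series.std(), diffs.mean(), diffs.max()])

# Build a toy dataset: 50 "genuine" and 50 "faked" clips.
X = np.array([dynamic_features(simulate_au_series(faked=f))
              for f in [False] * 50 + [True] * 50])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="rbf", gamma="scale")
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy on simulated data: {scores.mean():.2f}")
```

Note that the features deliberately capture timing and variability rather than average intensity, since the dynamic signature, not how strongly the face moves, is what the abstract credits for the machine's advantage.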

Related content:
NCBI ROFL: Classifying dogs’ facial expressions from photographs.
NCBI ROFL: Innocent until proven bearded.
NCBI ROFL: Botox makes you happy…because you have no other choice.

CATEGORIZED UNDER: feelings shmeelings
  • http://www.mazepath.com/uncleal/qz4.htm Uncle Al

    An implant measures (e.g., plasma cortisol level) and impresses pain – overseen by central authority. Everybody outside management will experience local, state, national, world population average pain. The world will seek mutual pleasure not pain.

    Ranking EU bureaucrats are exempt from taxation in their countries of origin, even as EU taxation rates and scopes keep increasing to finance social equity. How is that not a perfect solution toward government beneficence and social equality regardless of personal circumstance?

    • OWilson

      We need to wire up certain politicians to empirically test their claims of "I feel your pain!" :)


Seriously, Science?

Seriously, Science?, formerly known as NCBI ROFL, is the brainchild of two prone-to-distraction biologists. We highlight the funniest, oddest, and just plain craziest research from the PubMed research database and beyond. Because nobody said serious science couldn't be silly!
Follow us on Twitter: @srslyscience.
Send us paper suggestions: srslyscience[at]gmail.com.