If facial recognition software that can compare your features to a criminal database, or gather data for advertisers, wasn’t futuristic enough for you, consider this: Someday when you’re taking a class from a robot instructor, it might be able to tell how well you understand the material solely based on your facial expressions.
Jacob Whitehill, a computer science PhD student at the University of California, San Diego, has created software that would allow him to control how fast a video played just by moving his face. That was the first step, he says—showing that a computer could pick up on facial movements and, if it was programmed correctly, use those movements as instructions. You can check out video of his “smile detector” here.
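The core idea here, facial movements translated into playback instructions, amounts to a small control loop. The sketch below is purely illustrative: the `smile_score` input stands in for whatever value Whitehill's detector actually produced, and the linear mapping to playback speed is an assumption, since the article doesn't describe his real scheme.

```python
# Illustrative sketch only: mapping a per-frame "smile score" to a video
# playback rate. The face detector itself is stubbed out; Whitehill's
# system used a trained facial-expression classifier whose details are
# not given in the article.

def playback_rate(smile_score, min_rate=0.5, max_rate=2.0):
    """Map a smile score in [0, 1] to a playback-speed multiplier.

    A neutral face (score 0) slows the video; a full smile (score 1)
    speeds it up. Linear interpolation, clamped to the valid range.
    """
    score = max(0.0, min(1.0, smile_score))
    return min_rate + score * (max_rate - min_rate)

# A neutral face halves the speed; a broad smile doubles it.
print(playback_rate(0.0))   # 0.5
print(playback_rate(0.5))   # 1.25
print(playback_rate(1.0))   # 2.0
```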
But for a robot to read your face, it must first know what all your blinks and nods mean. Whitehill is now studying the faces of people as they try to learn, hoping to identify the patterns of facial movement that correlate with understanding, or with the frustration that comes from not understanding. To gather data, he watched students viewing a video lecture and recorded himself teaching his lab partners the finer points of German grammar. Some of what he's found is pretty obvious: you nod more when you understand. Other findings are less intuitive: people blink less often during the harder parts of a lecture.
Someday, Whitehill hopes, the machines that teach us will be programmed to understand the ins and outs of human facial expression. If students look happy and engaged, the machine will keep the same pace; if the entire class displays blank stares and no signs of comprehension, it will know to slow down. Whether a robot teacher can distinguish between pupils who don't understand and ones who simply don't care, however, remains to be seen.
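The class-pacing logic described here reduces to a simple rule over aggregated expression scores. The following is a hypothetical sketch: the per-student `engagement_scores` and the threshold are invented for illustration, not taken from Whitehill's work.

```python
# Hypothetical sketch of a robot lecturer's pacing rule. Engagement
# scores are assumed to come from some facial-expression classifier;
# low values stand for blank stares and no signs of comprehension.

def adjust_pace(engagement_scores, slow_threshold=0.3):
    """Return a pacing decision from per-student scores in [0, 1]."""
    if not engagement_scores:
        return "keep pace"
    avg = sum(engagement_scores) / len(engagement_scores)
    # If the whole class looks lost on average, slow down.
    return "slow down" if avg < slow_threshold else "keep pace"

print(adjust_pace([0.9, 0.8, 0.7]))    # keep pace
print(adjust_pace([0.1, 0.2, 0.15]))   # slow down
```

Note that this rule only sees the class average, which is exactly why the article's closing question stands: a bored student and a confused one can produce the same low score.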