A Computer That Can Read Your Mind

Facial Recognition Program Makes For Better Virtual Teachers

By LEE DYE

July 7, 2008

One of these days your computer will probably know what you are thinking before you know it yourself. The human face conveys emotions ranging from fear to confusion, and it can even betray a lie, sometimes involuntarily. Scientists are figuring out how to make use of those expressions.

At the University of California at San Diego, for example, a graduate student has developed a program that will slow down or speed up a video based entirely on changes in his facial expressions, such as a slight frown or a smile. The purpose of this particular program is to make robotic instructors more responsive to their students' needs, but there are many other potential applications for the work.

"The project I'm working on is how can we use machine perception, including things like facial expressions, to improve interactivity between students and teachers," said Jacob Whitehill, a computer science doctoral candidate. "That includes human teachers, and also robotic teachers, which is something our lab is increasingly interested in."

Whitehill has tested his technology on several other students, using software developed in the university's machine perception laboratory, called the Computer Expression Recognition Toolbox, or CERT. The machine got it right nearly half the time, slowing down a video of a lecture if the student was having trouble following it, and speeding it up if the student was following along easily.

"While these results show room for improvement, they already demonstrate the value of automatic real-time facial expression recognition in intelligent tutoring systems," Whitehill said in a paper describing his research.

In an interview he described how the system works.

A student sits in front of a computer as a webcam captures images of the student's face. The images are fed through a series of filters that preserve the useful information and strip out the rest. The machine has already been programmed to recognize certain expressions, "like a frown, or a smile or a nose wrinkle."

When it worked right, the video slowed down if the student showed signs of confusion, and sped up if the expression indicated that the lecture was being absorbed.
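To make the mechanism concrete, here is a minimal sketch of that feedback loop in Python with OpenCV. CERT itself is not a public library that can be scripted here, so the estimate_confusion function below is a hypothetical stand-in for the real expression classifier, and the playback-rate constants are likewise invented for illustration.

```python
# A minimal sketch of the feedback loop described above, assuming a
# webcam and the opencv-python package. CERT is not scriptable here,
# so estimate_confusion() is a hypothetical stand-in for the real
# expression classifier, and the rate constants are invented.
import cv2

def estimate_confusion(face_img) -> float:
    # Hypothetical placeholder: return 0.0 (relaxed) to 1.0 (confused).
    # In the real system, CERT's expression channels (frown, brow
    # movement, etc.) would feed this score.
    return 0.0

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cam = cv2.VideoCapture(0)   # webcam watching the student
playback_rate = 1.0         # 1.0 = normal lecture speed

while True:                 # runs until the camera fails or Ctrl+C
    ok, frame = cam.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        confusion = estimate_confusion(gray[y:y + h, x:x + w])
        # A confused face nudges the lecture slower; a relaxed face,
        # faster. Smoothing avoids jarring jumps in speed.
        target = 0.6 if confusion > 0.5 else 1.4
        playback_rate += 0.05 * (target - playback_rate)
    print(f"playback rate: {playback_rate:.2f}x")
cam.release()
```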

The most useful signal detected by the program was the rate of eye blinking. When students found the lecture difficult or challenging, their blink rate decreased, which is consistent with psychological studies showing that blinking slows when the mental load is high.
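For illustration, the blink-rate signal alone could drive a slow-down decision along these lines. The baseline rate and threshold below are assumptions made for this sketch, not figures from the study, and the blink events themselves would have to come from a separate eye detector.

```python
# Illustrative only: turning a blink count into a "mental load" flag.
# The baseline rate and the 0.6 threshold are assumptions for this
# sketch, not figures from the UCSD study.
import time
from collections import deque

BASELINE_BLINKS_PER_MIN = 15.0   # typical resting blink rate (assumed)

class BlinkRateMonitor:
    def __init__(self, window_s=60.0):
        self.window_s = window_s
        self.blinks = deque()    # timestamps of recent blink events

    def record_blink(self, t=None):
        # Call this whenever the eye detector reports a blink; drop
        # events that have aged out of the sliding window.
        t = time.time() if t is None else t
        self.blinks.append(t)
        while self.blinks and self.blinks[0] < t - self.window_s:
            self.blinks.popleft()

    def blinks_per_min(self):
        return len(self.blinks) * 60.0 / self.window_s

    def looks_overloaded(self):
        # Fewer blinks than baseline suggests high mental load,
        # consistent with the studies mentioned above.
        return self.blinks_per_min() < 0.6 * BASELINE_BLINKS_PER_MIN
```

A tutoring loop could call looks_overloaded() once a second or so and ease the playback rate down whenever it returns True.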

"What I want to know is how fast, or how slow, the student wants the material presented," Whitehill said. Automatic expression detection has the advantage of revealing emotions that even the student may not be aware of, or unwilling to express.

That has significant implications for other possible applications for this line of research.

Humans have no control over certain facial muscles, and those muscles can reveal what's really going on inside the brain, including whether a person is telling the truth. Although this particular project is directed at improving the student-teacher relationship, a program that could detect lying would be useful in everything from criminology to the interrogation of suspected terrorists, where time can be critically important.

Whitehill said it could also be useful in "differentiating between whether humans feel real pain or whether they are faking a pain." He's working on that project now.

It's also possible, he said, that it could help clinicians treat depression, a particularly insidious disease. Changes in expressions over a long period of time could suggest whether a drug was having the desired effect, for example.

This field is still pretty much in its infancy, and UCSD is only one of several institutions pioneering the research. Whitehill and his colleagues are already planning to branch into other areas of machine perception.

"We're interested in looking at signals beyond facial expression," he said, like body posture, head shakes and head nods and eye gaze, to monitor channels of communication between a student and a teacher, especially if the teacher is a robot. That's becoming increasingly common in schools around the world, so this is one small step toward making robots more like the real deal.

And who knows, maybe one of these days you won't even have to pick up the remote to control your TV set. All you'll have to do is smile to turn it on. Frown, and it will switch the channel to ABC.

Lee Dye is a former science writer for the Los Angeles Times. He now lives in Juneau, Alaska.