Whitehill said it could also be useful in "differentiating between whether humans feel real pain or whether they are faking a pain." He's working on that project now.
It's also possible, he said, that it could help clinicians treat depression, a particularly insidious disease. Changes in a patient's expressions over time could suggest whether a drug was having the desired effect, for example.
This field is still in its infancy, and UCSD is only one of several institutions pioneering the research. Whitehill and his colleagues are already planning to branch into other areas of machine recognition.
"We're interested in looking at signals beyond facial expression," he said, like body posture, head shakes and head nods and eye gaze, to monitor channels of communication between a student and a teacher, especially if the teacher is a robot. That's becoming increasingly common in schools around the world, so this is one small step toward making robots more like the real deal.
And who knows, maybe one of these days you won't even have to pick up the remote to control your TV set. All you'll have to do is smile to turn it on. Frown, and it will switch the channel to ABC.
Lee Dye is a former science writer for the Los Angeles Times. He now lives in Juneau, Alaska.