A Computer That Can Read Your Mind

One of these days your computer will probably know what you are thinking before you know it yourself. The human face conveys emotions ranging from fear to confusion, and can even betray a lie, sometimes involuntarily, and scientists are figuring out how to make use of those expressions.

At the University of California at San Diego, for example, a graduate student has developed a program that will slow down or speed up a video based entirely on changes in his facial expressions, like a slight frown or a smile. The purpose of this particular program is to make robotic instructors more responsive to their students' needs, but there are many other potential applications for the work.

"The project I'm working on is how can we use machine perception, including things like facial expressions, to improve interactivity between students and teachers," said Jacob Whitehill, a computer science doctoral candidate. "That includes human teachers, and also robotic teachers, which is something our lab is increasingly interested in."

Whitehill has tested his technology on several other students, using software developed in the university's Machine Perception Laboratory, called the Computer Expression Recognition Toolbox, or CERT. The machine got it right nearly half the time, slowing down a video of a lecture if the student was having trouble following it and speeding it up if the student was sailing along.

"While these results show room for improvement, they already demonstrate the value of automatic real-time facial expression recognition in intelligent tutoring systems," Whitehill said in a paper describing his research.

In an interview he described how the system works.

A student sits in front of a computer while a webcam captures images of the student's face. The images are fed through a series of filters that keep the useful information and discard the rest. The machine has already been programmed to recognize certain expressions, "like a frown, or a smile or a nose wrinkle."

When it worked right, the video slowed down if the student showed signs of confusion, and sped up if the expression indicated that the lecture was being absorbed.
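
To make that loop concrete, here is a minimal sketch in Python of how a webcam feed could drive playback speed from an expression score. It is not the actual CERT code: the face detection uses OpenCV's stock Haar cascade, and `estimate_confusion` is a hypothetical stand-in for a trained expression classifier.

```python
# Minimal sketch (not CERT): read webcam frames, detect a face,
# score "confusion" with a placeholder classifier, and nudge playback speed.
import cv2

# Stock OpenCV face detector; a real system would use its own filters and models.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def estimate_confusion(face_pixels):
    """Hypothetical stand-in for an expression classifier.

    A real system would return a score from a model trained on labeled
    expressions (frown, smile, nose wrinkle, etc.); here we just return 0.5.
    """
    return 0.5

playback_speed = 1.0
cap = cv2.VideoCapture(0)  # webcam watching the student's face

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        confusion = estimate_confusion(gray[y:y + h, x:x + w])
        # Signs of confusion slow the lecture down; signs of ease speed it up.
        if confusion > 0.7:
            playback_speed = max(0.5, playback_speed - 0.05)
        elif confusion < 0.3:
            playback_speed = min(2.0, playback_speed + 0.05)
    # A real player would apply playback_speed to the lecture video here.

cap.release()
```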

The most common cue the program detected was eye blinking. When students found the lecture difficult or challenging, the blink rate decreased, which is consistent with psychological studies showing that the rate drops when the mental load is high.
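
As a rough illustration of that blink-rate heuristic, blinks per minute could be mapped to a presentation speed as in the sketch below; the thresholds are invented for illustration and are not taken from the research.

```python
def suggest_playback_speed(blink_timestamps, window_seconds=60.0):
    """Map a recent eye-blink rate to a playback speed.

    blink_timestamps: times (in seconds) at which blinks were detected.
    The thresholds are illustrative guesses: a low blink rate is treated
    as a sign of high mental load, so the lecture is slowed down.
    """
    if not blink_timestamps:
        return 1.0
    recent = [t for t in blink_timestamps
              if t >= blink_timestamps[-1] - window_seconds]
    blinks_per_minute = len(recent) * 60.0 / window_seconds
    if blinks_per_minute < 10:      # few blinks: heavy mental load, slow down
        return 0.75
    if blinks_per_minute > 25:      # frequent blinks: light load, speed up
        return 1.25
    return 1.0
```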

"What I want to know is how fast, or how slow, the student wants the material presented," Whitehill said. Automatic expression detection has the advantage of revealing emotions that even the student may not be aware of, or unwilling to express.

That has significant implications for other possible applications for this line of research.

Humans have no control over certain facial muscles that can reveal what's really going on inside the brain, including whether a person is telling the truth. Although this particular project is directed at improving the student-teacher relationship, a program that could detect lying would be useful in everything from criminology to interrogations of suspected terrorists, where time can be extremely important.

Whitehill said it could also be useful in "differentiating between whether humans feel real pain or whether they are faking a pain." He's working on that project now.
