California scientists have found a way to see through another person's eyes.
"This is a major leap toward reconstructing internal imagery," said Jack Gallant, professor of psychology and coauthor of a study published today in Current Biology. "We are opening a window into the movies in our minds."
Gallant's coauthors acted as study subjects, watching YouTube videos inside a magnetic resonance imaging machine for several hours at a time. The team then used the brain imaging data to develop a computer model that matched features of the videos -- like colors, shapes and movements -- with patterns of brain activity.
"Once we had this model built, we could read brain activity for that subject and run it backwards through the model to try to uncover what the viewer saw," said Gallant.
Subtle changes in blood flow to visual areas of the brain, measured by functional MRI, predicted what was on the screen at the time -- whether it was Steve Martin as Inspector Clouseau or an airplane. The reconstructed videos are blurry because they layer all the YouTube clips that matched the subject's brain activity pattern. The result is a haunting, almost dream-like version of the video as seen by the mind's eye.
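The pipeline described above can be sketched in miniature. The toy below is not the authors' code and uses synthetic data throughout (the real study used motion-energy features and a large library of natural clips): it fits a linear encoding model from video features to voxel responses, then decodes a new response by ranking candidate clips by how well their predicted responses match and averaging the top matches, which is why reconstructions come out blurry.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: feature vectors for training clips (e.g. color,
# shape, motion descriptors) and the voxel responses they evoke.
n_train, n_feat, n_vox = 200, 10, 30
train_features = rng.normal(size=(n_train, n_feat))
true_weights = rng.normal(size=(n_feat, n_vox))
train_responses = train_features @ true_weights + 0.1 * rng.normal(size=(n_train, n_vox))

# Step 1: fit the encoding model (least squares, features -> voxel activity).
weights, *_ = np.linalg.lstsq(train_features, train_responses, rcond=None)

# Step 2: given a new brain response, run the model "backwards" by scoring
# a library of candidate clips against the observed activity pattern.
library = rng.normal(size=(500, n_feat))       # candidate clip features
viewed = 42                                    # index of the clip actually seen
observed = library[viewed] @ true_weights + 0.1 * rng.normal(size=n_vox)

predicted = library @ weights                  # predicted response per candidate
scores = np.array([np.corrcoef(p, observed)[0, 1] for p in predicted])
top = np.argsort(scores)[::-1][:10]            # best-matching clips

# Step 3: the reconstruction layers (averages) the top-ranked clips,
# producing the blurry, dream-like composite described in the article.
reconstruction = library[top].mean(axis=0)
```

With clean synthetic data the clip that was actually "viewed" ranks among the best matches, but the averaging step is what makes real reconstructions look like a haunting blend rather than a crisp replay.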
The researchers say the technology could one day be used to broadcast mental imagery -- the scenes that play out inside our minds independently of vision.
"If you can decode movies people saw, you might be able to decode things in the brain that are movie-like but have no real-world analog, like dreams," Gallant said.
The brain activity measured in this study is just a fraction of the activity that lets us see moving images. Other, more complex areas help us interpret the content of those images -- distinguishing faces from inanimate objects, for example.
"The brain isn't just one big blob of tissue. It actually consists of dozens, even hundreds of modules, each of which does a different thing," said Gallant. "We hope to look at more visual modules, and try to build models for every single part of the visual system."
More models, Gallant said, mean better resolution. It also means a ton more data to analyze.
"We need really big computers," Gallant said.
Shinji Nishimoto, a neuroscientist in Gallant's lab and the study's lead author, said the results shed light on how the brain understands and processes visual experiences.
"We need to know how the brain works in naturalistic conditions," Nishimoto said in a statement. "For that, we need to first understand how the brain works while we are watching movies."
Whether the technology could also be used to watch people's dreams or memories -- even intentions -- depends on how close those abstract visual experiences are to the real thing.
"We simply don't know at this point. But it's our next line of research," said Gallant.
If the technology could be used to broadcast imagery, it could one day allow people who are paralyzed to control their environment by imagining sequences of movements. Already, brain waves recorded through electrodes on the scalp can flip a switch, allowing people with Lou Gehrig's disease and other paralyzing conditions to choose letters on a computer monitor and communicate.
Gallant and his team are often asked whether the technology could be used in detective work or court cases -- an idea that brings to mind the futuristic crime-foiling action in "Minority Report."
But the potential to watch a person's memories may not be so far off. Whether such memories could be used in a court of law, however, would be limited not only by the technology but also by the nature of memory itself. As Gallant's website notes, an accurate readout of a faulty memory only yields misleading information.