by PHILIP BALL
Joseph-Benoît Guichard’s Dreams of love: “Maybe we won’t be able to script our dreams in detail—but it doesn’t seem impossible to imagine selecting their cast of characters”
In the movies—in films like Inception and Brazil—dreams look like, well, movies. But how much “visual” information do your dreams really contain? When you think about it, even familiar faces aren’t exactly “seen” so much as “sketched”— you know who they are because it’s your dream after all, not because you necessarily picture them in every detail. Do you actually “hear” what they say, or just somehow “know” it? Besides, dreams aren’t only or even primarily visual and aural. Often what strikes us most about a dream is the emotional aura, whether that’s fear, excitement or whatever. How could that ever be recorded in a “dream” home movie?
…
This kind of “black-box” (perhaps here a grey box) approach has enabled researchers at the University of California at Berkeley, led by neuroscientist Jack Gallant, to reconstruct spooky sketches of what movies people are watching just by monitoring their brains using fMRI. Gallant claims that his work is “opening a window into the movies in our minds.” The Berkeley group began in 2008 by decoding still images. They recorded the fMRI signals in an area of the brain associated with the early stages of visual processing, while subjects looked at 1,750 different images. A mathematical procedure allowed them to crack the code linking a particular distribution of light and dark in the images to the corresponding fMRI signal in the brain. With this Rosetta stone, the researchers could then deduce, with an accuracy of typically between 70 and 90 per cent, which of 120 candidate images a subject was looking at purely from the fMRI data of his or her brain activity. “Our results suggest that it may soon be possible to reconstruct a picture of a person’s visual experience from measurements of brain activity alone”, they claimed. A Japanese team at the ATR Computational Neuroscience Laboratories in Kyoto, led by Yukiyasu Kamitani, reported a similar result around the same time.
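The logic of such an identification experiment can be sketched in a few lines of code. This is emphatically not the Berkeley group’s actual method (their encoding models were far richer); it is a toy illustration, with made-up sizes and an assumed linear relationship between pixels and voxel responses, of the two steps the paragraph describes: first “crack the code” from training data, then pick whichever candidate image best predicts a new brain scan.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions — the real study used 1,750 training images and many
# voxels in early visual cortex; these small numbers are illustrative.
n_train, n_candidates, n_pixels, n_voxels = 200, 120, 64, 30

# Hypothetical "images": vectors of light/dark pixel values.
train_images = rng.standard_normal((n_train, n_pixels))
candidates = rng.standard_normal((n_candidates, n_pixels))

# Assume (purely for illustration) that each voxel responds as an
# unknown linear function of the image, plus measurement noise.
true_weights = rng.standard_normal((n_pixels, n_voxels))

def brain_response(images):
    noise = 0.5 * rng.standard_normal((images.shape[0], n_voxels))
    return images @ true_weights + noise

# Step 1: "crack the code" — estimate the image-to-voxel mapping
# from training scans by least squares.
train_responses = brain_response(train_images)
weights, *_ = np.linalg.lstsq(train_images, train_responses, rcond=None)

# Step 2: identify which candidate image a new scan came from by
# predicting the response each candidate would evoke and choosing
# the best correlational match.
true_index = 42
measured = brain_response(candidates[true_index:true_index + 1])[0]
predicted = candidates @ weights
scores = [np.corrcoef(p, measured)[0, 1] for p in predicted]
guess = int(np.argmax(scores))
print(guess == true_index)
```

With even this crude linear model, the measured response singles out the correct image from the 120 candidates, which is the essence of the “Rosetta stone” step: the decoder never reconstructs the picture, it only recognises which known picture fits best.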