Scientists reveal insights into fundamental questions about sensory perception
Princeton, N.J. -- Concluding an unusual intellectual contest, a Princeton scientist has revealed the principles behind a computer model of a mouse brain capable of recognizing spoken words.
Neuroscientist John Hopfield created the brain simulation several months ago based on a theory he developed about how the brain interprets sensory perceptions, from touch to hearing. Hopfield did not, however, publish his insights immediately. Instead, he made the simulation available on a Web site and, in September, issued a challenge to colleagues to deduce the principle behind it.
"We wanted to provoke neuroscientists into thinking about thinking," said Hopfield, who developed the brain simulation and contest in collaboration with Carlos Brody of New York University.
Hopfield and Brody brought the contest to a conclusion on Dec. 14, announcing a first-place winner from Cambridge University and a second-place winner from the California Institute of Technology. The two described the principle in detail in a paper to be published later this winter in the Proceedings of the National Academy of Sciences. An uncorrected proof of the paper is being made available to journalists early to coincide with the end of the contest.
The question behind Hopfield's challenge is a critical one for neuroscience: How does the brain recognize patterns in the sensory inputs it receives? The problem is particularly difficult for inputs that arrive over a period of time, such as spoken words or the sensation of touching a familiar object.
Based on years of investigation, Hopfield concluded that performing such feats requires brain cells to be very sensitive to the timing with which they fire off electrical signals to one another. The conventional view has been that networks of neurons respond only to broad differences in firing patterns: a slow series of electrical spikes versus a rapid burst, for example.
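The timing-sensitivity idea can be illustrated with a toy sketch. This is not Hopfield and Brody's actual model; it is a minimal, hypothetical coincidence detector showing why precise spike timing can carry information that average firing rates cannot: a pattern is "recognized" only when many spikes land within a narrow time window.

```python
# Illustrative sketch only (not the published model): a toy coincidence
# detector. Each input neuron contributes one spike time in milliseconds;
# the detector scores how many spikes fall within a narrow window, which
# is large only when timing across neurons is nearly synchronized.

def coincidence_score(spike_times, window=5.0):
    """Return the largest number of spikes falling in any `window`-ms span."""
    times = sorted(spike_times)
    best = 0
    start = 0  # left edge of a sliding window over the sorted spikes
    for end, t in enumerate(times):
        # Shrink the window from the left until it spans at most `window` ms.
        while t - times[start] > window:
            start += 1
        best = max(best, end - start + 1)
    return best

# Spikes evoked by a familiar input arrive nearly synchronized...
familiar = [100.0, 101.5, 102.0, 103.2, 104.1]
# ...while spikes from an unfamiliar input are scattered in time.
unfamiliar = [100.0, 140.0, 205.0, 310.0, 455.0]

print(coincidence_score(familiar))    # 5: all spikes coincide within 5 ms
print(coincidence_score(unfamiliar))  # 1: no two spikes coincide
```

Note that both inputs contain the same number of spikes, so a detector reading only the overall firing rate could not tell them apart; only the timing distinguishes them.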
Contact: Steven Schultz