SUNY researchers tease apart the processes that govern visual attention
Look around you. What do you see? Perhaps a cluttered desk, a living room full of household items, a busy streetscape? Visualize yourself searching that space for your car keys, the television remote control, or a speed-limit sign. Now imagine that you have suffered a head injury. You search and search for the object, but cannot find it. Your brain simply refuses to cooperate.
Greg Zelinsky, professor of cognitive science at Stony Brook University, and Robert McPeek, associate professor of biological sciences at SUNY College of Optometry, are collaborating on a project to tease apart what happens in a normal brain when a person searches his or her environment for an object. The information will be useful in helping people who have suffered brain damage.
“People who’ve had head injuries often have problems with eye movement that can affect their ability to read, drive, or even just walk around,” said McPeek. “We’re trying to understand how eye movements work in the normal system first, and once we understand the normal system we can start to think about ways to help people who have damage.”
A Model of the Brain
The first step in understanding a normal brain, McPeek said, is to figure out how people decide where to look. “If you’re out there in the world, there are a million things you could choose to look at, but somehow our eye movements are very precise,” he said. “They take our eyes to the right place at the right time so we can pick up the information that we need to do whatever it is we’re trying to do. It’s not very well understood how that happens.”
To unravel this mystery, McPeek and Zelinsky are zeroing in on the part of the brain that is responsible for eye movement—the superior colliculus. Their goal is to create a model that will predict where activity should occur in the brain during the process of searching a visually complex scene for an object.
“Basically, we’re using knowledge about a part of the brain important for eye movement to better predict where attention is directed in everyday tasks,” said Zelinsky.
According to Zelinsky, who is spearheading the model’s development, the model will draw on existing knowledge about the neurophysiology of the superior colliculus to better predict where fixations will land when a viewer scans visually complex displays of common objects. The model will also be used to predict the degree and distribution of activity in the superior colliculus as a subject views these complex displays while performing a task. These predictions will then be tested by McPeek, who runs the neuroscience laboratory that will make the neural recordings.
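The article doesn’t spell out the model’s internals, but the core idea of a retinotopic “priority map” whose peak pulls the eyes to the next fixation can be sketched in a few lines. In the toy Python below, the map, the Gaussian smoothing, and the specific numbers are all illustrative assumptions, not the team’s actual model:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def predicted_activity(priority_map, sigma=2.0):
    """Blur a map of target-similarity scores into a smooth population
    response, loosely mimicking the broad tuning of collicular neurons."""
    return gaussian_filter(priority_map, sigma=sigma)

def next_fixation(activity):
    """Predict the next fixation as the peak of the activity map."""
    return np.unravel_index(np.argmax(activity), activity.shape)

# Toy "scene": a 20x20 grid of weak scores plus one target-like location.
rng = np.random.default_rng(0)
scene = rng.random((20, 20)) * 0.2
scene[5, 12] = 1.0  # the location that best matches the search target

activity = predicted_activity(scene)
print("Predicted fixation:", next_fixation(activity))  # -> at or near (5, 12)
```

A model in this spirit yields two testable outputs at once: where the eyes should go next, and how strongly each patch of the colliculus should respond.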
Generating Real-World Data
The experiment begins with a rhesus macaque monkey. McPeek is currently training the monkey to search for an object, such as a teddy bear, in a realistic visual scene. “It doesn’t know exactly what the teddy bear looks like; there could be 10 different teddy bears,” he said. “The situation is similar to a person searching for a pencil on a cluttered desk.”
Next, McPeek will use information generated by Zelinsky’s model to strategically place electrodes in the monkey’s brain and record how frequently the neurons around each electrode fire when the monkey is asked to search for a particular item. He will also measure the monkey’s eye movements with an infrared camera that precisely tracks where the eye is looking. The team will then compare the results to the model’s predictions.
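The article doesn’t describe the lab’s analysis pipeline, but turning raw infrared-camera samples into discrete fixations is usually done with a standard velocity-threshold (I-VT) rule: samples where gaze moves slowly are fixations, fast ones are saccades. A minimal sketch, with entirely hypothetical data:

```python
import numpy as np

def detect_fixations(x, y, t, vel_thresh=30.0):
    """Label gaze samples as fixation vs. saccade by velocity threshold
    (the standard I-VT method).

    x, y : gaze position in degrees of visual angle
    t    : sample timestamps in seconds
    Samples moving slower than vel_thresh deg/s count as fixation.
    Returns a boolean mask, one entry per sample.
    """
    speed = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)  # deg/s
    return np.concatenate([[True], speed < vel_thresh])

# Hypothetical 1 kHz recording: a hold, a fast jump, then another hold.
t = np.arange(0, 0.01, 0.001)
x = np.array([0, 0, 0, 0, 2, 4, 6, 6, 6, 6], dtype=float)
y = np.zeros_like(x)
print(detect_fixations(x, y, t))  # fixation, saccade, fixation
```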
“If the results match the predictions of the model, then we have evidence that maybe we understand how this brain area is wired and how it’s used,” said McPeek. “If the lab data show us something different from what’s in the model, we’ll know we have to change some of the model’s functions.”
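One simple way to make that match concrete, purely as an assumption here rather than anything the article describes, is to correlate the model’s predicted activity with the measured firing rates across recording sites:

```python
import numpy as np

def model_fit(predicted, recorded):
    """Correlate model-predicted activity with recorded firing rates.

    predicted, recorded: 1-D arrays, one value per recording site.
    Returns Pearson's r; values near 1 support the model's wiring
    assumptions, while low values flag functions that need revising.
    """
    return np.corrcoef(predicted, recorded)[0, 1]

# Hypothetical data: predicted activity vs. spike rates at five sites.
predicted = np.array([0.9, 0.2, 0.7, 0.1, 0.5])
recorded  = np.array([42.0, 11.0, 35.0, 8.0, 24.0])  # spikes/s
print(f"model-data correlation: {model_fit(predicted, recorded):.2f}")
```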
According to the researchers, previous studies of eye movement programming in the brain have used very simple stimuli, such as a dot on a blank screen. “Now we’re going from dots to car keys,” said McPeek.
Both researchers agree that the study is a major step toward gaining a basic understanding of brain function and will provide essential information needed to help people with vision problems.
The researchers received seed money from the SUNY Networks of Excellence to get started on their project. They aim to continue their work with additional outside funding and currently have a grant application pending at the U.S. National Science Foundation’s program on Collaborative Research in Computational Neuroscience.