Wednesday, July 18, 2018

As brain extracts meaning from vision, study tracks progression of processing

Here’s the neuroscience of a neglected banana (and a lot of other things in daily life): Whenever you look at its color — green in the store, then yellow, and eventually brown on your countertop — your mind categorizes it as unripe, ripe, and then spoiled. A new study that tracked how the brain turns simple sensory inputs, such as “green,” into meaningful categories, such as “unripe,” shows that the information follows a progression through many regions of the cortex, and not exactly in the way many neuroscientists would predict.

The study, led by researchers at MIT’s Picower Institute for Learning and Memory, undermines the classic belief that separate cortical regions perform sharply distinct, non-overlapping roles. Instead, as animals in the lab refined what they saw into a specific, behaviorally relevant category, brain cells in each of six cortical regions operated along a continuum between sensory processing and categorization. To be sure, general patterns were evident for each region, but activity associated with categorization was shared surprisingly widely, say the authors of the study, published in the Proceedings of the National Academy of Sciences.

“The cortex is not modular,” says Earl Miller, Picower Professor of Neuroscience in the Department of Brain and Cognitive Sciences at MIT. “Different parts of the cortex emphasize different things and do different types of processing, but it is more of a matter of emphasis. It’s a blend and a transition from one to the other. This extends up to higher cognition.”

The study not only refines neuroscientists’ understanding of a core capability of cognition, it could also inform psychiatrists’ understanding of disorders in which categorization judgements are atypical, such as schizophrenia and autism spectrum disorders, the authors said.

Scott Brincat, a research scientist in Miller’s Picower lab, and Markus Siegel, principal investigator at the University of Tübingen in Germany, are the study’s co-lead authors. Tübingen postdoc Constantin von Nicolai is a co-author.

From seeing to judging

In the research, animals played a simple game. They were presented with shapes that cued them to judge what came next — either a red or green color, or dots moving in an upward or downward direction. Based on the initial shape cue, the animals learned to glance left to indicate green or upward motion, or right to indicate red or downward.
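To make those trial contingencies concrete, here is a minimal sketch of the rule the animals learned, written in Python; the function and argument names are illustrative stand-ins, not taken from the experiment’s actual code or stimulus parameters.

    # Illustrative sketch of the trial rule described above; names and
    # values are stand-ins, not the experiment's actual parameters.
    def correct_response(cued_feature, color, motion):
        """Return the rewarded glance direction for one trial.

        cued_feature -- which feature the shape cue tells the animal to
                        judge: "color" or "motion"
        color        -- stimulus color on this trial: "green" or "red"
        motion       -- dot-motion direction on this trial: "up" or "down"
        """
        if cued_feature == "color":
            return "left" if color == "green" else "right"   # green -> left, red -> right
        if cued_feature == "motion":
            return "left" if motion == "up" else "right"     # upward -> left, downward -> right
        raise ValueError("cued_feature must be 'color' or 'motion'")

    # Example trial: cued to judge motion while the dots drift upward
    print(correct_response("motion", color="red", motion="up"))  # -> "left"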

Meanwhile, the researchers were eavesdropping on the activity of hundreds of neurons in six regions across the cortex: the prefrontal cortex (PFC), the posterior inferotemporal cortex (PIT), the lateral intraparietal area (LIP), the frontal eye fields (FEF), and visual areas MT and V4. The team analyzed the data, tracking each neuron’s activity over the course of the game to determine how much it participated in sensory versus categorical work, while allowing for the possibility that many neurons do at least a little of both. They first refined their analysis on simulated data, then applied it to the actual neural recordings.
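As a rough illustration of what “how much a neuron participates in sensory versus categorical work” might mean in practice, the toy Python sketch below compares how well a graded sensory feature and a binary category label each predict a single neuron’s firing rate. It is a simplified stand-in under our own assumptions, not the analysis the team actually used.

    # Toy sketch only: NOT the study's actual analysis. It places a single
    # neuron on a continuum from sensory (-1) to categorical (+1) by
    # comparing how much firing-rate variance a graded sensory feature
    # versus a binary category label explains.
    import numpy as np

    def sensory_category_index(rates, sensory_feature, category_label):
        rates = np.asarray(rates, dtype=float)

        def r_squared(predictor, y):
            # Variance in y explained by a simple linear fit on the predictor
            X = np.column_stack([np.asarray(predictor, dtype=float),
                                 np.ones(len(y))])
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            residuals = y - X @ coef
            return 1.0 - residuals.var() / y.var()

        r2_sensory = r_squared(sensory_feature, rates)
        r2_category = r_squared(category_label, rates)
        total = r2_sensory + r2_category
        return 0.0 if total == 0 else (r2_category - r2_sensory) / total

    # Mirroring the team's strategy of testing the analysis on simulated data first:
    rng = np.random.default_rng(0)
    motion_dir = rng.uniform(-90, 90, size=500)            # graded sensory feature
    category = (motion_dir > 0).astype(float)              # "upward" vs. "downward"
    rates = 5 + 3 * category + rng.normal(0, 1, size=500)  # cell driven mostly by category
    print(sensory_category_index(rates, motion_dir, category))  # positive: leans categorical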

They found that while sensory processing occurred largely where classic neuroscience would predict, most heavily in MT and V4, categorization was surprisingly distributed. As expected, the PFC led the way, but the FEF, LIP, and PIT often showed substantial categorization activity, too.

“Our findings suggest that, although brain regions are certainly specialized, they share a lot of information and functional similarities,” Siegel says. “Thus, our results suggest the brain should be thought of as a highly connected network of talkative, related nodes, rather than as a set of highly specialized modules that only sparsely hand off information to each other.”

The patterns of relative sensory and categorization activity varied by task, too. Few neuroscientists would be surprised that V4 cells were particularly active for color sensation while MT cells were active for sensing motion, but, more interestingly, category signals were far more widespread. For example, most of the areas were involved in categorizing color, including those traditionally thought to be specialized for motion.

The scientists also note another key pattern. Their analysis could discern the dimensionality of the information the neurons were processing, and it showed that sensory processing was highly multidimensional (as if considering many different details of the visual input), while categorization activity was far more focused (as if simply judging “upward” or “downward”).
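For readers curious how “dimensionality” of neural activity can be measured at all, one common summary, though not necessarily the measure used in this paper, is the participation ratio of a neural population’s principal-component variances; a minimal sketch follows.

    # Illustrative only: the participation ratio is one common way to
    # summarize how many dimensions a set of population responses spans.
    # The paper's own dimensionality measure may differ.
    import numpy as np

    def participation_ratio(activity):
        """Effective dimensionality of population activity.

        activity -- array of shape (n_trials, n_neurons)

        Ranges from 1 (all variance along a single axis, like a tightly
        focused category signal) up to n_neurons (variance spread evenly,
        like rich, multidimensional sensory coding).
        """
        centered = activity - activity.mean(axis=0)
        cov = np.cov(centered, rowvar=False)
        eigvals = np.clip(np.linalg.eigvalsh(cov), 0.0, None)  # guard against tiny negatives
        return eigvals.sum() ** 2 / (eigvals ** 2).sum()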

Cognition in the cortex

The broad distribution of activity related to categorization, Miller speculates, might be a sign that when the brain has a goal (in this case, to categorize), that goal needs to be represented broadly, even if the PFC is where the judgement is ultimately made. It’s a bit like a business in which everyone, from the CEO down to the workers on the manufacturing floor, benefits from understanding the point of the enterprise while doing their work.

Miller also says the study extends some prior results from his lab. A previous study showed that PFC neurons can carry highly multidimensional information, while in this study they were largely focused on just one dimension. The synthesis of the two lines of evidence may be that PFC neurons accommodate whatever degree of dimensionality pursuing a goal requires: they are versatile about how versatile to be.

Let all this sink in, the next time you consider the ripeness of a banana or any other time you have to extract meaning from something you perceive.

The work was supported by the National Institute of Mental Health, the European Research Council, and the Center for Integrative Neuroscience.



from MIT News https://ift.tt/2LtEekQ
