New study from the Pillow lab uncovers rotating neural trajectories during decision-making

In October 2020, the Pillow lab published its study “Prefrontal cortex exhibits multidimensional dynamic encoding during decision-making” in Nature Neuroscience, in collaboration with Valerio Mante (ETH Zurich). Led by Mikio Aoi, now an assistant professor at the University of California, San Diego, the study challenges the traditional model of perceptual decision-making. Using novel analysis tools, the authors show how the prefrontal cortex dynamically encodes and integrates information about an upcoming decision, opening new perspectives on how the brain turns sensory information into action.


Perceptual decision-making is the process of transforming sensory information into a decision. For example, while searching for a taxi, you would look at the traffic until you spot a car with the characteristics of a taxi and then raise your hand to signal for the car to stop (see Figure 1). In this example, the flow of cars provides the sensory input. However, the characteristics that distinguish a taxi from other cars are contextual. In New York, until recently, all taxis were yellow, no matter their shape. In contrast, taxis in London have a distinct shape but may vary in color. Our brains must take this sensory input and decide, depending on the context, whether or not to wave down the car. This type of decision is thought to rely on the prefrontal cortex.

To unravel how neural activity in the prefrontal cortex evolves during perceptual decision-making, the authors revisited a data set published by Valerio Mante and colleagues in Nature in 2013 (Mante et al. 2013). In that study, two monkeys were trained to perform a context-based perceptual decision-making task (see Figure 1). The stimulus was composed of many dots that moved predominantly to the right or to the left (motion) and were mostly red or mostly green (color). The monkeys first saw a context cue indicating which feature of the stimulus they should use to make their choice: color or motion. In the color context, the monkeys had to choose the target that matched the color of the stimulus. In the motion context, they had to choose the target on the side of the screen toward which the dots were moving. While the monkeys performed this task, the activity of neurons in the prefrontal cortex was recorded.
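To make the task rule concrete, here is a minimal sketch in Python (illustrative only, with hypothetical variable names, not code or data from the study) of how the rewarded choice depends on the context: the same dot stimulus can require opposite responses depending on whether color or motion is the relevant feature.

```python
import numpy as np

rng = np.random.default_rng(0)

def correct_choice(context, motion_coh, color_coh):
    """Return the rewarded choice (+1 or -1) for a trial.

    In the 'motion' context only the sign of the motion coherence matters;
    in the 'color' context only the sign of the color coherence matters.
    (Schematic rule; the sign conventions here are arbitrary assumptions.)
    """
    relevant = motion_coh if context == "motion" else color_coh
    return 1 if relevant > 0 else -1

# Simulate a handful of trials with a random context and signed coherences.
for _ in range(5):
    context = rng.choice(["motion", "color"])
    motion_coh = rng.uniform(-1, 1)   # >0: dots drift one way, <0: the other
    color_coh = rng.uniform(-1, 1)    # >0: mostly one color, <0: mostly the other
    print(context, round(motion_coh, 2), round(color_coh, 2),
          "->", correct_choice(context, motion_coh, color_coh))
```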

Instead of analyzing the activity of individual neurons, the authors propose to describe the activity of the population as a whole (the way the neurons act together) in several independent subspaces: sensory, context and choice. They then represent the trajectory of the neural activity in reduced 3-dimensional subspaces. In the traditional model of perceptual decision-making, a population of neurons in the prefrontal cortex accumulates sensory evidence for or against a given action. This standard accumulation model predicts that the neural activity should be constrained to a line (see Figure 2, left panel). Using their novel method, the authors found that in each subspace the trajectory of neural activity is not constrained to a line and instead forms rings (see Figure 2, right panel). This type of rotation had previously been observed in the motor cortex by Mark Churchland and colleagues (Churchland et al. 2012, Nature), but not yet in the context of perceptual decision-making.
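The sketch below does not reproduce the authors' novel method; it is only a minimal Python illustration, using simulated population activity and ordinary PCA, of the geometric distinction described above: an accumulation-like population traces out an approximately one-dimensional (line-like) trajectory in the reduced subspace, whereas a rotational population traces out a ring that needs two dimensions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_timebins = 100, 200
t = np.linspace(0, 2 * np.pi, n_timebins)

# Two toy populations (simulated, not real data): each neuron is a random
# mixture of a small set of latent signals, plus noise.
# (a) Accumulation-like: one ramping latent -> the trajectory hugs a line.
ramp = t / t.max()
W_line = rng.normal(size=(n_neurons, 1))
X_line = W_line @ ramp[None, :] + 0.05 * rng.normal(size=(n_neurons, n_timebins))

# (b) Rotation-like: two phase-shifted latents -> the trajectory traces a ring.
circle = np.vstack([np.cos(t), np.sin(t)])
W_ring = rng.normal(size=(n_neurons, 2))
X_ring = W_ring @ circle + 0.05 * rng.normal(size=(n_neurons, n_timebins))

def top_pcs(X, k=2):
    """Project a (neurons x time) activity matrix onto its top-k principal components."""
    Xc = X - X.mean(axis=1, keepdims=True)
    U, S, _ = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :k].T @ Xc   # k x time trajectory in the reduced subspace

for name, X in [("accumulation", X_line), ("rotation", X_ring)]:
    traj = top_pcs(X)
    var = np.var(traj, axis=1)
    print(f"{name}: variance along PC1 vs PC2 = {var[0]:.2f} vs {var[1]:.2f}")

# The accumulation trajectory is almost one-dimensional (PC2 variance near 0),
# whereas the rotational trajectory spreads over a 2D plane, forming a ring.
```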

The findings of authors Mikio Aoi, Valerio Mante and Jonathan Pillow uncover a rich picture of the perceptual decision-making process. The dimensions of the decision problem (city, color and shape of the cars, hand raised or not) are represented in separate neural subspaces. Sensory information and decisions are accumulated along one neural dimension and are then transformed, through rotation, into others. The functional significance of this rotation remains an open question raised by the study. Their work furthers our understanding of how the brain encodes information during perceptual decision-making.

By Caroline Jahn
