PASADENA, Calif.--Contrary to what one might imagine, the way each of us interacts with the world is not a simple matter of seeing (or touching, or smelling) and then reacting. Even the best baseball hitter eyeing a fastball does not swing at what he sees. The neurons and neural connections that make up our sensory systems are far too slow for that to work. "Everything we sense is a little bit in the past," says Richard A. Andersen of the California Institute of Technology, who has now uncovered the trick the brain uses to get around this problem.
Work by Andersen, the James G. Boswell Professor of Neuroscience at Caltech, and his colleagues Grant Mulliken of MIT and Sam Musallam of McGill University offers the first neural evidence that voluntary limb movements are guided by the brain's prediction of what will happen an instant into the future. "The brain is generating its own version of the world, a 'forward model,' which allows you to know where you actually are in real time. It takes the delays out of the system," Andersen says.
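In engineering terms, a forward model integrates a copy of each outgoing motor command to predict where the limb or cursor is right now, instead of trusting sensory feedback that is already stale. The Python sketch below is only an illustration of that idea; the time step, delay, and command values are invented for the example, not quantities from the study.

# Illustrative forward-model sketch (not the paper's fitted model).
# Sensory feedback arrives with a delay, so the best real-time estimate
# integrates an efference copy of the motor commands forward in time
# from the last (delayed) observation.

DT = 0.01             # simulation time step, seconds (assumed)
SENSORY_DELAY = 0.09  # assumed visual feedback delay, ~90 ms

def forward_estimate(delayed_position, command_history, dt=DT):
    """Predict the current position from a delayed observation plus the
    velocity commands issued since that observation."""
    position = delayed_position
    for velocity_command in command_history:
        position += velocity_command * dt
    return position

# Example: the cursor was last seen at x = 0.0, 90 ms ago, while a steady
# velocity of 0.5 units/s was being commanded. The forward model "removes
# the delay" by predicting where the cursor is now.
n_steps = int(SENSORY_DELAY / DT)
print(forward_estimate(0.0, [0.5] * n_steps))  # ~0.045, ahead of the stale percept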
Research in Andersen's laboratory focuses on understanding the neurobiological underpinnings of brain processes, including the senses of sight, hearing, balance, and touch, and the neural mechanisms of action. The lab is working toward implanted neural prosthetic devices that would serve as an interface between the brain signals of severely paralyzed individuals and artificial limbs--allowing thoughts to control movement.
Research along these lines conducted at the University of Pittsburgh and Carnegie Mellon University recently allowed monkeys to feed themselves using a robotic limb that they controlled only with their thoughts. The thoughts were picked up via an array of electrodes sitting on top of the primary motor cortex, a lower-level brain region responsible for carrying out motor functions.
Andersen's group focuses on a higher-level area of cortex, the posterior parietal cortex (PPC), where sensory stimuli are transformed into movement plans.
In their experiments, Andersen and his colleagues trained two monkeys to use a joystick to move a cursor on a computer screen from a small red circle into a green circle while keeping their gaze fixed on the red circle. The monkeys typically generated curved trajectories, and to increase the curvature one monkey was trained to move the cursor around an obstacle: a large blue circle placed between the cursor's initial location and the target, which the monkey had to skirt, without touching it, on the way to the green circle. As the monkeys performed the tasks, electrodes measured the activity of neurons in the PPC, allowing Andersen and his colleagues to monitor the signals--commands for movement--in real time.
The studies showed that neurons in the PPC produce signals that represent the brain's estimation of the current and upcoming movement of the cursor. "An internal estimate of the current state of the cursor can be used immediately by the brain to rapidly correct a movement, avoiding having to rely entirely on late-arriving sensory information, which can result in slow and unstable control," Mulliken says.
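One simple way to picture how such an estimate could be read out of a neural population is a linear decoder: fit a matrix mapping binned firing rates to cursor position and velocity by least squares. The Python sketch below uses synthetic stand-in data; it illustrates the general decoding idea, not the actual analysis in the paper.

import numpy as np

# Hypothetical decoding sketch with synthetic data; shapes, tuning, and
# noise levels are invented. Rows are time bins.
rng = np.random.default_rng(0)
n_samples, n_neurons = 2000, 40

cursor_state = rng.standard_normal((n_samples, 4))  # [x, y, vx, vy]
tuning = rng.standard_normal((4, n_neurons))        # assumed linear tuning
firing_rates = cursor_state @ tuning + 0.1 * rng.standard_normal((n_samples, n_neurons))

# Least-squares decoder: find W so that firing_rates @ W approximates cursor_state.
W, *_ = np.linalg.lstsq(firing_rates, cursor_state, rcond=None)

decoded = firing_rates @ W
print("mean squared decoding error:", np.mean((decoded - cursor_state) ** 2))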
"The idea is that you feed back the command you make for movement into those areas of the brain that plan the movement (i.e., the PPC)," Andersen says. "The signal about the movement taking place is adjusted to be perfectly aligned in time with the actual movement--what you're moving in your head matches with what you're moving in the real world." The effect is akin to an athlete visualizing his performance in his mind. Studies have previously shown that these simulations of movement trajectories run through the posterior parietal cortex, and run at actual speed, taking the same amount of time as the activity would in real life.
In the Pittsburgh robotic-arm study, the neural signal driving the robotic limb was what is known as a "trajectory signal," which represents the path that must be taken to move from one point to another, like using a computer mouse to drag an object across a screen. Andersen's lab had previously shown that a different signal in the posterior parietal cortex, called the "goal signal," can be used instead to jump an object directly from one point to another.
"This goal signal is much faster for reaching a goal than a trajectory signal," Andersen says. "Fast goal decoding is very advantageous for rapid sequences such as typing. Our new study shows that the posterior parietal cortex codes the trajectory as well as the goal, which makes this brain area an attractive target for neural prosthesis. Not only does this increase the versatility and the number of prosthetic applications, but it also makes the decoding easier since the trajectories can be better estimated if the goal is known."
The paper, "Forward Estimation of Movement State in Posterior Parietal Cortex," will be published in a future print issue the Proceedings of the National Academy of Sciences but is now available online. First author, Grant Mulliken, was a graduate student at Caltech and is now a postdoctoral fellow at the Massachusetts Institute of Technology; coauthor Sam Musallam was a postdoctoral fellow at Caltech and is currently an assistant professor at McGill University in Montreal, Canada.