Neuron unites two theoretical models on motion detection
Computation of motion by T4 cells in the fly brain more complex than previously believed
As their name indicates, photoreceptor cells in the eye respond to light: is an image point bright or dark? They do not report the direction of a movement. That perception arises only in the brain, through computations that compare the light signals coming from adjacent image points. Engineers, physicists and neurobiologists have been debating the exact nature of these computations for around 50 years. Scientists from the Max Planck Institute of Neurobiology have now combined two theories about these computations, which were previously considered to be alternative hypotheses, and discovered that both are carried out in a single neuron.
Flies are usually very difficult to catch. No wonder: they invest around ten percent of their brain in detecting and processing image motion. To a fly, an approaching hand seems to move in slow motion, and its evasive manoeuvre is triggered long before any real danger arises. Scientists have been investigating for decades how the fly brain can perceive and process movements so quickly and accurately. “Our goal is slowly coming into view, and we are close to completely decoding the neuronal circuit of motion perception in the fly,” says Alexander Borst, who has been working on this problem with his Department at the Max Planck Institute of Neurobiology for many years. The scientists have now come one step closer to the answer: they have provided experimental data that combine two theories previously considered as alternatives.
Over 50 years ago, two rival theoretical models were developed to explain how information about the direction of motion could be computed from the signals transmitted by adjacent image points. One model states that light stimuli along one direction, referred to as the preferred direction, enhance each other. The other model assumes, in contrast, that light stimuli along the opposite direction, known as the null direction, suppress each other. In both cases, a weak direction-selective signal arises, which must then be further processed and amplified. “Interestingly, however, we discovered that the very first cells that respond to motion stimuli, the T4 and T5 cells, already display strong directional selectivity,” reports Alexander Borst.
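In textbook accounts, these two schemes are usually identified with the Hassenstein-Reichardt correlator (enhancement through multiplication) and the Barlow-Levick detector (suppression through division or shunting inhibition). The short Python sketch below illustrates both ideas for brightness signals recorded at two neighbouring image points; the first-order delay filter, the arithmetic operations and all function names are illustrative assumptions, not the models implemented in the study.

    import numpy as np

    def delay(signal, tau=5.0):
        # First-order low-pass filter standing in for a neural delay line.
        signal = np.asarray(signal, dtype=float)
        out = np.zeros_like(signal)
        for t in range(1, len(signal)):
            out[t] = out[t - 1] + (signal[t] - out[t - 1]) / tau
        return out

    def enhancement_detector(s1, s2):
        # Preferred-direction model: the delayed signal from image point 1
        # multiplies, and thereby amplifies, the direct signal from point 2.
        return delay(s1) * np.asarray(s2, dtype=float)

    def suppression_detector(s1, s2):
        # Null-direction model: the delayed signal from image point 2
        # divides, and thereby suppresses, the direct signal from point 1.
        return np.asarray(s1, dtype=float) / (1.0 + delay(s2))

Fed with the brightness traces of a moving edge, either of these illustrative detectors responds more strongly to one direction of motion than to the other, but only weakly so, which is why such a signal would still need to be amplified downstream.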
In order to resolve this discrepancy, the neurobiologists refined their experimental set-up so that they could stimulate individual functional columns of the fly brain in succession and record the responses of the direction-selective T4 cells. The data they collected and the corresponding computer simulations were clear: T4 cells amplify the input signals when these run along their preferred direction and suppress them when they run along the null direction. Both of the proposed mechanisms are thus implemented in the T4 cells of the fly brain, and what was thought to be an ‘either-or’ scenario turned out to be a ‘both-and’ one. “It’s no wonder that these cells can differentiate so accurately between motion directions,” says Jürgen Haag, first author of the study. “Nature’s solution is more complicated than either of the proposed models.”
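To make this ‘both-and’ finding concrete, the combined mechanism can be sketched by joining the two illustrative operations above in a single unit, reusing the delay filter from the previous sketch; again, this is a schematic assumption, not the published circuit model.

    def combined_detector(s_pref, s_centre, s_null):
        # Illustrative three-input unit in the spirit of the T4 finding:
        # the central signal is amplified by a delayed input from the
        # preferred side and suppressed by a delayed input from the null side.
        return (delay(s_pref) * np.asarray(s_centre, dtype=float)) / (1.0 + delay(s_null))

Such a unit needs three input signals, one enhancing, one central and one suppressive, which is exactly the number the researchers’ simulations called for, as described in the following paragraph.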
In their computer simulations of such a combined mechanism, the Max Planck researchers needed three different input signals to the T4 cells. Interestingly, however, T4 cells receive input signals from four other cells. This suggests that the fourth, still unidentified, input signal to the T4 cells may hold a further surprise for the final computation. “Needless to say, we would now also like to know what kind of information the T4 cells receive via this fourth channel,” says Alexander Borst, explaining the next step in the research. “We will then be able to show for the first time how information about motion direction is calculated in a neural network from individual image points.”
SM/HR