Every fleeting glance at your surroundings sets off a chain reaction inside your skull, shuttling retinal signals to a tiny midbrain hub. New findings show that this relay does more than aim your eyes; it also helps you decide what those eyes are seeing.
Scientists from the University of Chicago have now traced high‑level judgment back to the same circuit that pilots a reflexive gaze.
David Freedman, professor of neurobiology at the university, says the discovery redraws some of the most familiar maps of the mind.
The superior colliculus caps the midbrain, at the top of the brainstem, and appears in fish, birds, and mammals alike, hinting at an ancestry stretching back half a billion years.
Its layered tissue forms a two‑dimensional grid that rapidly transforms flashes of light into head and eye movements.
For decades, textbooks described the structure as a simple switchboard for saccades, the rapid jumps that refresh a scene several times a second. The cortex, by contrast, was said to handle every higher act, from object recognition to chess.
Growing evidence counters that neat divide. A 2017 review chronicled how collicular neurons encode attention and decision signals once thought exclusive to the cerebral mantle.
Those hints set the stage for a closer look at the midbrain in real time. Freedman’s group answered with electrodes and a clever behavioral trick.
They trained two rhesus macaques (Macaca mulatta) to hold their eyes on a central point while labeling drifting dot patterns as belonging to one of two learned categories. Fruit‑juice rewards encouraged fast, accurate button presses rather than eye shifts.
During the task, arrays of fine wires captured spikes in the posterior parietal cortex, a well‑known decision center, and simultaneously inside the colliculus.
Timing analysis revealed that category information emerged in both places at almost the same moment.
More surprisingly, the midbrain signal was stronger and more consistent than the cortical one, indexing the upcoming choice with remarkable fidelity. That meant the ancient hub was not merely echoing the cortex but leading the conversation.
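To make "category information emerged" concrete, comparisons like this typically rest on population decoding: a classifier is trained to read the learned category out of each region's spike counts, time bin by time bin, and the regions are compared on how early and how reliably that readout beats chance. The sketch below illustrates the idea on toy data; the array shapes, the logistic-regression decoder, and every number in it are assumptions chosen for illustration, not the study's actual pipeline.

```python
# Toy population-decoding sketch: how early and how well can the learned
# category be read out of each region's spike counts? Everything here
# (shapes, cell counts, the injected signal) is a made-up illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def decode_category_over_time(spike_counts, labels):
    """spike_counts: (n_trials, n_neurons, n_timebins); labels: (n_trials,)."""
    n_trials, n_neurons, n_bins = spike_counts.shape
    accuracy = np.empty(n_bins)
    for t in range(n_bins):
        X = spike_counts[:, :, t]              # population vector in one time bin
        clf = LogisticRegression(max_iter=1000)
        # 5-fold cross-validated accuracy; chance is 0.5 for two categories
        accuracy[t] = cross_val_score(clf, X, labels, cv=5).mean()
    return accuracy

# Hypothetical simultaneous recordings: 200 trials, 25 time bins per region.
labels = rng.integers(0, 2, size=200)                         # two learned categories
parietal = rng.poisson(5.0, size=(200, 60, 25)).astype(float)
colliculus = rng.poisson(5.0, size=(200, 40, 25)).astype(float)
# Plant a category signal so the toy decoder has something to find:
# slightly earlier and stronger in the "colliculus" array.
colliculus[labels == 1, :10, 10:] += 2.0
parietal[labels == 1, :10, 12:] += 1.0

acc_ppc = decode_category_over_time(parietal, labels)
acc_sc = decode_category_over_time(colliculus, labels)
print(f"peak accuracy  -- parietal: {acc_ppc.max():.2f}  colliculus: {acc_sc.max():.2f}")
print(f"first bin >60% -- parietal: {int(np.argmax(acc_ppc > 0.6))}  "
      f"colliculus: {int(np.argmax(acc_sc > 0.6))}")
```

In an analysis of this general shape, the region whose decoder climbs above chance earlier and more reliably is the one said to carry the stronger, earlier category signal.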
“This is a really surprising place to find these kinds of cognitive signals because this area of the brain is traditionally associated with simpler spatial orienting behaviors and even reflexive functions,” said Freedman. He notes that the finding challenges a long‑held hierarchy that placed the neocortex firmly on top.
Correlation, however, does not guarantee causation. To probe necessity, the team injected minute doses of muscimol, a drug that mimics the brain's main inhibitory neurotransmitter, to silence collicular neurons for about 30 minutes.
The monkeys still stared at the fixation point and pushed buttons with normal speed, showing that motor paths remained largely intact. Even so, their category judgments slipped toward chance until normal firing returned.
A 2021 study by Michele Basso and colleagues had already shown that damping the same region biases simpler motion decisions in primates. The new work extends that causal role to abstract rules untied to any spatial cue.
“Our results show us that this area is really important for the task,” Freedman added. Without its input, the cortex alone could not salvage performance.
The collicular sheet represents space in a neat polar grid, each cell preferring a particular direction and eccentricity. Such organization may offer high‑speed scaffolding for loftier computations.
Freedman suspects that the brain projects non‑spatial information onto that grid to exploit wiring already optimized for rapid competition among options.
Spatial coding could act as a common currency, letting disparate sensory and memory traces jostle under a single rulebook.
Separate teams have found midbrain neurons that single out faces mere milliseconds after stimulus onset, bypassing the slower ventral stream.
Others report collicular bursts that end deliberations once evidence reaches a threshold, acting like an internal referee.
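That "internal referee" picture matches a standard accumulate-to-threshold account of decision commitment: noisy evidence builds over time, and a burst fires the instant it crosses a bound, locking in the choice. The toy simulation below is a minimal sketch of that logic with arbitrary, made-up parameters; it is not a model fitted to any of the recordings described in these studies.

```python
# Toy accumulate-to-threshold decision: noisy evidence drifts toward a bound,
# and the "burst" (commitment) happens the moment a bound is crossed.
# Drift, noise, and bound values are arbitrary illustrations.
import numpy as np

rng = np.random.default_rng(1)

def accumulate_to_bound(drift=1.0, noise=1.0, bound=0.8, dt=0.001, max_t=2.0):
    """Simulate one decision; returns (choice, reaction_time_in_seconds)."""
    evidence, t = 0.0, 0.0
    while abs(evidence) < bound and t < max_t:
        # drift is the average evidence gained per second; noise scales the jitter
        evidence += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    choice = 1 if evidence > 0 else 0          # which side the evidence ended on
    return choice, t

trials = [accumulate_to_bound() for _ in range(500)]
choices, rts = zip(*trials)
print(f"chose option 1 on {np.mean(choices):.0%} of trials, "
      f"mean RT {np.mean(rts) * 1000:.0f} ms")
```

Raising the bound in this sketch trades speed for accuracy, which is the kind of knob such a referee circuit could, in principle, control.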
“Some of this data might be telling us that the spatial parts of the brain are getting recruited into helping us perform these non-spatial cognitive functions,” said Barbara Peysakhovich, first author of the study. Her remark underscores how evolution can co‑opt one circuit for many jobs.
Functional MRI supports the idea in humans. Activity flares in the colliculus when volunteers search cluttered scenes or shift attention without moving their eyes.
A study that probed eye‑driven Go/No‑Go decisions in progressive supranuclear palsy linked slow saccades to impaired evidence accumulation, implicating the same midbrain route.
Such overlap suggests that eye‑movement disorders and cognitive decline may share a single lesion site.
Computational neuroscientists are now modeling collicular circuits to design cameras that spot hazards before full scene analysis is complete. The Chicago data offer a fresh template for those algorithms.
Ultimately, the study blurs a tidy border between perception and reflection. It invites a broader search for additional double‑duty nodes throughout the brain.
Freedman plans to test whether similar signals appear in people, using magnetoencephalography combined with deep‑brain electrical recordings made during surgery. Such work could clarify how much of the primate finding generalizes to everyday human reasoning.
Another priority is mapping the pathways that carry the category code into and out of the midbrain. Anatomical studies already show dense links from the frontal eye fields to the colliculus, as well as ascending fibers that reach the cortex via the thalamus.
Untangling those loops may reveal how a single decision circulates between brainstem and cortex before reaching awareness. The results could also help clinicians fine‑tune deep‑brain stimulators aimed at restoring gaze and cognition in movement disorders.
The study is published in Nature Neuroscience.