Human minds see what we "expect" to see, not what we actually see
06-21-2025

The smell of toast drifts from the kitchen. Across the room, you notice a hand reach for a butter knife. Without thinking, you already sense the next move.

Your brain predicts that a slice of bread will meet a dab of butter, and it usually gets the call right. That quiet, automatic guesswork keeps everyday scenes feeling smooth rather than jumbled.

Behind this split-second foresight is a set of brain regions known to researchers as the “action observation network (AON),” a collection that fires up whenever we watch another person reach, grasp, or tinker with an object.

Decades of lab tests – often showing only one- to two-second video snippets – mapped out the basics of that network.

Yet real life rarely serves actions in single bites. It offers full recipes, packed with intention and order. That difference set the stage for a revealing new study.

Action observation network in the brain

The project came from a team at the Netherlands Institute for Neuroscience led by Christian Keysers and Valeria Gazzola.

Keysers puts it simply: “What we would do next becomes what our brain sees,” a reminder that prediction lies at the heart of perception.

Earlier work suggested that information flows in one direction – visual regions pass details to parietal and premotor hubs, which then plan a matching response. The new study asked whether that pipeline flips when a viewer can forecast the next step.

To probe the question, researchers created two versions of everyday scenes – say, building a sandwich or folding a shirt.

In the natural cut, actions played out in their expected order; in the scrambled cut, the same clips were shuffled. Volunteers watched both while their brain activity was recorded.

Because some participants were epilepsy patients already implanted with intracranial electrodes for medical monitoring, the team captured electrical signals deep inside the cortex with millisecond precision.

What you and your brain see

When the action order made sense, the brain worked differently than textbook diagrams predicted. Feedback signals ran from higher-level motor regions down to the sensory cortex.

That top-down push quieted visual areas, as if the brain decided it could lighten the workload because the next move was practically a foregone conclusion.

In contrast, the jumbled cut forced the cortex to lean on incoming sight alone, reviving the classic feed-forward path from eye to hand.

The switch was clearest in the premotor cortex. This area, best known for preparing our own movements, pulsed first when scenes unfolded logically. Electrical rhythms then traveled backward toward regions that handle touch and vision.

The result hints that motor memories – habits such as slicing a roll – prime the brain to handle what our eyes are about to see.

Brain uses memory to ‘see’

Valeria Gazzola puts the reversal in plain terms: “Now, information was actually flowing from the premotor regions, that know how we prepare breakfast ourselves, down to the parietal cortex, and suppressed activity in the visual cortex.”

She adds, “It is as if they stopped to see with their eyes, and started to see what they would have done themselves.”

The quotes capture a striking idea: in familiar settings, action perception may depend more on stored know-how than raw sight.

That idea slots neatly into the larger framework of predictive coding, a theory proposing that the brain constantly compares expectations to incoming data, then sends error signals when reality misses the mark.

By showing the mechanism in natural sequences, the study strengthens the view that prediction is not a special feature reserved for high-stakes moments. It is a default mode woven into everyday social life.

Efficiency through feedback loops

Suppressing the visual cortex during a well-known routine might sound risky, yet it saves energy and speeds comprehension.

If the brain trusts its guess, it trims redundant sensory checks, freeing resources for surprises – such as an unexpected ingredient swap.

Electroencephalography and functional MRI data from the same study revealed lower metabolic demand when actions were predictable, backing the idea that foreknowledge cuts neural overhead.

These feedback loops also clarify how we keep track of other people in noisy or visually cluttered places.

By leaning on motor memories, the brain can stitch together patchy views into a coherent scene, a skill vital for teamwork on a crowded street or communication across a dinner table.

Why does this matter?

Understanding how the motor system shapes perception could help fine-tune rehabilitation after stroke. Therapies that train patients to anticipate movement sequences – rather than simply imitate single motions – might better rewire damaged circuits.

The findings also inspire engineers working on assistive robots and augmented-reality glasses. Systems that predict human intent a fraction of a second early can choose safer paths, hand over tools at the right moment, or flag anomalies before accidents unfold.

Beyond medicine and machines, the work nudges us to appreciate the hidden labor our brains perform. Every time we pass the salt or catch a tossed set of keys, layers of neural forecasting keep the exchange seamless.

By mapping that choreography with intracranial precision, the researchers have shown that much of what we ‘see’ comes from inside, projected forward by experience rather than painted on solely by the eye.

Future studies on how the brain sees

Future studies will test whether the same feedback pattern appears in more complex social exchanges – playing music together, learning a new sport, or reading facial expressions in fast-moving conversations.

If motor-based prediction proves central across those domains, training programs that expand a person’s movement repertoire could sharpen perceptual skills as well.

For now, the take-home message is clear. When an action unfolds in a familiar rhythm, the brain lets memory call many of the shots. That shortcut keeps life running smoothly, one well-timed guess at a time.

The full study was published in the journal Cell Reports.
