How the brain decides what to focus on – and what to ignore
06-28-2025

We often think we’re aware of everything around us, but our brains are constantly deciding where to focus.

Think about walking down a busy street. You might glance at a flashy car or a bright advertisement. But when it comes time to cross the street, those distractions vanish. Instead, your brain zooms in on the walk signal, moving vehicles, and people nearby.

That switch in focus – how we go from being drawn to attention-grabbing sights to zeroing in on a specific task – has puzzled scientists for years.

Researchers at Yale University wanted to understand not just what grabs our attention, but how we shift it toward things that really matter in the moment.

What the brain chooses to see

Most attention research looks at what draws our eyes – a sudden movement, a bold color, or something new. But this study focused on something different: what happens when our attention is deliberately steered toward a goal.

“We have a limited number of resources with which we can see the world,” said Ilker Yildirim, assistant professor of psychology in Yale’s Faculty of Arts and Sciences and senior author of the study.

“We think of these resources as elementary computational processes; each perception we experience, such as the position of an object or how fast it’s moving, is a result of exerting some number of these elementary perceptual computations.”

The brain focuses on the task at hand

To explore this, the team created a model called “adaptive computation.” This system, like a mental rationing tool, allocates our perceptual resources to the things most relevant to what we’re trying to do.

If you’re trying to cross a street safely, the model prioritizes the walk sign over a flashy distraction.

“Our model reveals a mechanism by which human attention identifies what things in a dynamic scene are relevant to the goal at hand, and then rations perceptual computations accordingly,” said study co-author Mario Belledonne, a student in Yale’s Graduate School of Arts and Sciences.
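The rationing idea can be illustrated with a minimal sketch. This is not the authors' actual model – the function name, relevance scores, and proportional-split rule below are all illustrative assumptions – but it captures the core intuition: a fixed budget of "elementary perceptual computations" gets divided among objects according to how relevant each one is to the current goal.

```python
def allocate_computations(relevance, budget):
    """Toy sketch (not the published model): split `budget` elementary
    computations across objects in proportion to each object's
    goal-relevance score."""
    total = sum(relevance.values())
    if total == 0:
        # Nothing is goal-relevant: spread the budget evenly.
        n = len(relevance)
        return {obj: budget // n for obj in relevance}
    return {obj: round(budget * r / total) for obj, r in relevance.items()}

# Crossing the street: the walk signal and a moving car matter to the
# goal; the flashy billboard barely registers.
scene = {"walk_signal": 0.5, "moving_car": 0.4, "billboard": 0.1}
print(allocate_computations(scene, budget=100))
# → {'walk_signal': 50, 'moving_car': 40, 'billboard': 10}
```

In this toy version, the billboard still receives a sliver of processing; drive its relevance to zero relative to the other objects and it effectively vanishes from the allocation – a crude analogue of how a distractor can drop out of awareness.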

Tracking focus in real time

To test their ideas, the researchers ran experiments using a screen filled with colored dots. Participants were asked to track certain moving dots while ignoring others. The dots moved unpredictably, forcing people to adjust their focus constantly.

Occasionally a dot would flash briefly, and participants pressed a key whenever they noticed one. This helped the researchers map how attention moved – almost second by second.

The adaptive computation model was able to predict where and when these shifts would happen, based on which dots were more relevant to the task.

Mental effort involved in a task

In another version of the task, participants still tracked a handful of dots, but the researchers changed how many “distractor” dots appeared and how fast they moved.

Afterward, people rated how hard the task felt. Interestingly, the model’s predictions matched the participants’ feelings: when the model worked harder to track the targets, participants said the task felt harder.

In other words, the model gave insight into something very human – the mental effort we feel when concentrating on a tough task.

Mimicking the brain’s ability to focus

One surprising takeaway from this work is how our brains selectively “ignore” things. That eye-catching car or glowing billboard? If it doesn’t help with what we’re trying to do, it might just disappear from our awareness.

“We want to work out the computational logic of the human mind, by creating new algorithms of perception and attention, and comparing the performance of these algorithms to that of humans,” Yildirim said.

This approach could even guide how we build future artificial intelligence. Unlike current AI systems that try to absorb everything, this new model mimics how humans sometimes miss obvious things – not due to error, but because it’s a smart way to focus.

“We think this line of work can lead to systems that are a bit different from today’s AI, something more human-like,” Yildirim said. “This would be an AI system that when tasked with a goal might miss things, even shiny things, so as to flexibly and safely interact with the world.”

The full study was published in the journal Psychological Review.
