The brain can rationally set aside biases when making decisions
The decisions that people make are often governed by a pre-existing bias or belief. However, a new study from neuroscientists at Columbia University has revealed that the brain can be surprisingly rational, putting a bias aside in order to apply logical reasoning while making a decision.
The research suggests that the brain prioritizes evidence during the decision-making process. The study also highlights how prior knowledge is evaluated and updated over time as new information reaches the brain.
Dr. Michael Shadlen is a principal investigator at Columbia’s Mortimer B. Zuckerman Mind Brain Behavior Institute and the senior author of the research.
“As we interact with the world every day, our brains constantly form opinions and beliefs about our surroundings,” said Dr. Shadlen. “Sometimes knowledge is gained through education, or through feedback we receive. But in many cases we learn, not from a teacher, but from the accumulation of our own experiences. This study showed us how our brains help us to do that.”
The team set out to investigate whether existing knowledge is modified when new or conflicting evidence is presented.
The researchers developed a series of trials in which study participants judged whether a group of dots was moving to the left or to the right across a computer screen. As new groups of dots continued to appear, the individuals were also asked to judge whether they thought the computer program had an underlying bias.
The participants were not aware that a bias had been programmed into the computer, causing the dots to move in one direction more often than the other.
“The bias varied randomly from one short block of trials to the next,” said study first author Dr. Ariel Zylberberg. “By altering the strength and direction of the bias across different blocks of trials, we could study how people gradually learned the direction of the bias and then incorporated that knowledge into the decision-making process.”
The researchers evaluated the participants in two ways: by monitoring how the bias influenced the participants' decisions and their confidence in those decisions, and by asking them to report the most likely direction of movement in each block of trials. Both approaches showed that participants used sensory evidence to update their beliefs about the directional bias of the dots.
“Originally, we thought that people were going to show a confirmation bias, and interpret ambiguous evidence as favoring their preexisting beliefs,” said Dr. Zylberberg. “But instead we found the opposite: People were able to update their beliefs about the bias in a statistically optimal manner.”
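The "statistically optimal" updating the researchers describe can be illustrated with a minimal Bayesian sketch. This is not the authors' actual model; the hypotheses and probabilities below are invented for illustration. An observer entertains a few hypotheses about the bias, then multiplies each hypothesis's prior probability by the likelihood of each new observation and renormalizes:

```python
# A minimal sketch (not the study's model): Bayesian updating of a belief
# about a directional bias from a sequence of observed motion directions.
# Hypotheses and numbers are illustrative assumptions, not from the paper.

hypotheses = ["bias-left", "no-bias", "bias-right"]
# Assumed probability that a trial's dots move right under each hypothesis:
p_right_given_h = [0.3, 0.5, 0.7]

def update(prior, moved_right):
    """One Bayes step: weight each hypothesis by the likelihood of the
    observation, then renormalize so the beliefs sum to 1."""
    likelihood = [p if moved_right else 1 - p for p in p_right_given_h]
    posterior = [pr * lk for pr, lk in zip(prior, likelihood)]
    total = sum(posterior)
    return [p / total for p in posterior]

belief = [1 / 3, 1 / 3, 1 / 3]  # start fully uncertain about the bias
observations = [True, True, False, True, True]  # dots mostly move right
for moved_right in observations:
    belief = update(belief, moved_right)

# Belief should now favor "bias-right" after mostly-rightward evidence.
print(dict(zip(hypotheses, [round(b, 3) for b in belief])))
```

In this toy version, mostly-rightward evidence shifts the belief toward the "bias-right" hypothesis while never fully discarding the alternatives, which mirrors the gradual, evidence-driven learning the participants displayed.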
The researchers believe that two different situations were being considered simultaneously in the brain – one in which the bias existed and one in which it did not.
“When we look hard under the hood, so to speak, we see that our brains are built pretty rationally,” said Dr. Shadlen. “Even though that is at odds with all the ways that we know ourselves to be irrational.”
This study is published in the journal Neuron.