12-04-2018

Brain imaging traces how humans view natural settings to boost AI

Researchers at Yale University are using brain measurements to predict where people’s eyes will move as they view images of natural scenes. According to the researchers, a better understanding of the human visual system will help advance technology driven by artificial intelligence, such as autonomous vehicles.

Study co-author Marvin Chun is the Richard M. Colgate Professor of Psychology. He explained, “We are visual beings and knowing how the brain rapidly computes where to look is fundamentally important.”

Eye movement is a widely studied research topic, and experts have previously established where a person’s gaze is most likely to fall within different natural scenes. However, the way that the brain coordinates this ability, which is critical for survival, is not entirely understood.

In a previous “mind reading” experiment, Professor Chun’s team managed to reconstruct facial images from functional MRI (fMRI) brain imaging data alone.

For the current study, Professor Chun and lead author Thomas P. O’Connell used the same approach to demonstrate that they could predict where people would direct their attention and gaze while looking at natural scenes. They did this by analyzing the brain data with deep convolutional neural networks, models widely used in artificial intelligence systems.
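The paper itself details the full method, but a minimal sketch of the underlying idea, decoding a coarse fixation-density (“saliency”) map for each scene from patterns of brain activity with a simple linear model, might look like the following. All data here are synthetic and all sizes are hypothetical; the study’s actual pipeline, which relates fMRI responses to saliency predictions from deep convolutional neural networks, is considerably more sophisticated.

```python
# Hypothetical sketch: decode fixation-density maps from (synthetic) fMRI
# voxel responses with a ridge-regression decoder, then score predictions
# by correlating them with the true maps on held-out scenes.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_scenes, n_voxels = 200, 500   # hypothetical dataset sizes
map_h, map_w = 8, 8             # coarse fixation-density map per scene

# Synthetic ground-truth fixation maps, and voxel responses that are a
# noisy linear mixture of them (a stand-in for real fMRI recordings).
true_maps = rng.random((n_scenes, map_h * map_w))
mixing = rng.normal(size=(map_h * map_w, n_voxels))
voxels = true_maps @ mixing + 0.5 * rng.normal(size=(n_scenes, n_voxels))

X_train, X_test, y_train, y_test = train_test_split(
    voxels, true_maps, test_size=0.25, random_state=0)

# Fit a linear decoder from voxel patterns to fixation-density maps.
decoder = Ridge(alpha=10.0).fit(X_train, y_train)
pred_maps = decoder.predict(X_test)

# For each held-out scene, correlate the predicted map with the true one.
scores = [np.corrcoef(p, t)[0, 1] for p, t in zip(pred_maps, y_test)]
print(f"mean predicted-map correlation: {np.mean(scores):.3f}")
```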

“The work represents a perfect marriage of neuroscience and data science,” said Professor Chun. “People can see better than AI systems can. Understanding how the brain performs its complex calculations is an ultimate goal of neuroscience and benefits AI efforts.”

The study is published in the journal Nature Communications.

By Chrissy Sexton, Earth.com Staff Writer
