A team of researchers from George Washington University (GW) has recently investigated how the human brain processes objects in its visual system, and found that object perception strongly depends on people’s prior knowledge and experience of the specific objects. These findings could have important implications in a variety of applied settings, including medical displays, cognitive assistants, and product and environmental design.
“Since the way we perceive objects determines how we interact with them, it is important to visually process them quickly and with high detail,” explained study senior author Sarah Shomstein, a professor of Cognitive Neuroscience at GW. “However, the way our eyes perceive and process an object can be different depending on what we know about this object. Our study shows, for the first time, that if we recognize an object as a tool, we perceive it faster but with less detail. If we recognize an object as a non-tool, we perceive it slower but with higher detail.”
The scientists showed participants images of objects that are easily manipulated by hand, such as coffee mugs, screwdrivers, or snow shovels, as well as objects that are rarely manipulated by hand, such as a potted plant, a fire hydrant, or a picture frame. In half of the trials, a small gap was cut out of the bottom of each object; in the other half, the object flickered on the computer screen. Participants were asked to report the presence of the gap or flicker, allowing the researchers to assess the speed and detail of object processing and to identify the brain regions responsible for object perception.
The results revealed that objects usually manipulated by hand were perceived faster than non-manipulable objects, making it easier for participants to notice the flicker, while objects not usually manipulated by hand were perceived in greater detail, making it easier to spot the small gaps in them.
“The differences in perception between ‘mugs’ and ‘plants’ in both speed and detail of perception means that these objects are sorted by the visual system for processing in different brain regions,” said study lead author Dick Dubbelde, a recent doctoral graduate at GW. “In other words, your knowledge of the object’s purpose actually determines where in the brain object processing will occur and how well you will perceive it.”
Moreover, when the researchers interfered with object recognition by making it harder to tell whether an object was manipulable – for instance, by turning it upside down – the differences in the speed and detail of perception vanished.
These findings – published in the journal Psychological Science – could explain individual differences in object perception by showing that prior knowledge and experience structure perception.