When a bonobo wrinkles its nose or lifts its browridge, the expression does more than pose for the camera. These flashes of movement form a language that researchers have struggled to pin down.
Now an international team has sharpened that language map by adapting a human facial‑coding toolkit to Pan paniscus (bonobos), cataloging 28 distinct moves and showing that 22 stem from specific muscles.
Dr. Catia Correia‑Caeiro of Leipzig University’s Institute of Biology leads the project, which unites experts from Germany, Switzerland, France, the United Kingdom, and the United States.
The team borrowed the Facial Action Coding System, a 1978 catalog that labels every visible muscle shift with a numeric action unit, then upgraded the scheme for bonobos.
“This adaptation of ChimpFACS for bonobos fills an important gap in our ability to study facial expressions across different primate species,” said Dr. Correia‑Caeiro, who is also affiliated with the Max Planck Institute for Evolutionary Anthropology.
The new manual lets scientists compare bonobo, chimpanzee, and human faces move for move.
Humans can muster about 30 action units, yet bonobos rival their chimpanzee cousins with 22, suggesting that expressive power rests on how signals are used socially rather than on sheer anatomical range.
The catalog even adds a rarely recorded brow‑lowering move, AU41, that deepens the glabella without the human‑exclusive corrugation.
The team sifted through roughly 55 hours of high‑definition video featuring 241 bonobos in zoos, sanctuaries, and the wild, coding each frame by hand.
The footage revealed three independent ear gestures (ears forward, ears elevated, and ears flattened), expanding the expressive palette beyond the eyes and mouth.
The researchers also spotted brief lid‑tightening blinks that accompany nose wrinkles, hinting at graded intensity in bonobo signals.
The team noticed that some action units, such as cheek raisers and glabella lowerers, often fire in rapid succession, suggesting combination rules that mirror syllables in spoken language.
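To see what "combination rules" means in practice, here is a minimal sketch of how coded expression data could be tallied for co-occurring action units. The events and AU labels below are illustrative placeholders, not the study's actual data.

```python
from collections import Counter
from itertools import combinations

# Each coded expression event lists the action units (AUs) active together.
# These example events are invented for illustration only.
events = [
    {"AU6", "AU41"},          # cheek raiser + glabella lowerer
    {"AU6", "AU41", "AU9"},   # the same pair, plus a nose wrinkler
    {"AU6", "AU9"},
    {"AU41"},                 # a unit appearing alone
]

# Count how often each pair of AUs fires within the same event.
pair_counts = Counter(
    pair
    for event in events
    for pair in combinations(sorted(event), 2)
)

# Pairs that recur far more often than chance would hint at
# rule-governed combinations rather than random muscle activity.
print(pair_counts.most_common(3))
```

Real analyses would compare these observed pair frequencies against what random mixing of the same units would predict, but the counting step is this simple.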
Despite their similarities to chimpanzees, bonobos have distinct facial features that complicate direct comparison.
Subtle traits, like lighter lip coloration, reduced brow-ridge prominence, and more flexible lower lips, affect how facial movements appear and are perceived. These differences can mislead coders who rely on chimpanzee references alone.
Without a dedicated coding system, earlier studies on bonobos varied wildly in the number of described expressions, from just 5 to as many as 46.
That inconsistency made it hard to compare results or identify patterns. A tailored toolkit ensures more reliable tracking and opens the door to rigorous, reproducible analysis.
“By better understanding their facial expressions, we can more accurately evaluate their emotional states and overall well‑being,” Correia‑Caeiro explained. Caretakers can now read those signals with new confidence.
Bonobos are endangered, with population estimates hovering between 29,000 and 50,000 individuals according to the IUCN Red List.
Knowing when a captive ape is tense or relaxed helps zoos tweak enrichment schedules, separate clashing individuals sooner, and design habitats that cut social friction before it spirals into injury.
Shared action units let scientists trace which facial abilities pre‑date the split between humans and the Pan lineage six million years ago.
A 2020 multi‑species review argues that facial mobility scales with social complexity, placing bonobos near the top of the primate scale.
Bonobos live in female‑bonded, relatively tolerant groups, so extra facial nuance may diffuse conflicts without resorting to violence, offering a living model for hypotheses on early human cooperation.
The NetFACS project applies network analysis to thousands of coded faces, revealing that great‑ape expressions form dense interaction graphs where muscle combos shift with context.
In bonobos, preliminary graphs show tighter clustering of play‑related units than threat units, hinting that relaxed social atmospheres encourage richer facial “vocabularies.”
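The kind of network comparison described above can be sketched in a few lines: treat each action unit as a node, each within-event co-occurrence as an edge, and compare how densely the graphs connect across social contexts. Everything below (the contexts, events, and AU labels) is a made-up toy, assuming only the general NetFACS idea of co-occurrence graphs.

```python
from collections import defaultdict
from itertools import combinations

# Illustrative coded events grouped by social context; AU sets are invented.
events_by_context = {
    "play":   [{"AU12", "AU25", "AU26"}, {"AU12", "AU25"}, {"AU12", "AU26"}],
    "threat": [{"AU4", "AU9"}, {"AU4"}, {"AU9"}],
}

def edge_weights(events):
    """Build a co-occurrence graph as {frozenset({a, b}): count}."""
    graph = defaultdict(int)
    for event in events:
        for a, b in combinations(sorted(event), 2):
            graph[frozenset((a, b))] += 1
    return graph

for context, events in events_by_context.items():
    graph = edge_weights(events)
    # More distinct edges and higher total weight per event indicate a
    # denser "vocabulary" of combined movements in that context.
    print(context, len(graph), sum(graph.values()))
```

In this toy data the play graph links every unit to every other, while the threat graph has a single recurring pair, mirroring the tighter clustering of play-related units the preliminary graphs suggest.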
Computer‑vision engineers are training algorithms on the new bonobo library, aiming for real‑time welfare dashboards that flag stress before keepers can see it.
The advanced systems could eventually watch forest camera traps, letting conservationists gauge mood across remote populations without intrusive tagging.
Comparative biologists plan to pit bonobo, chimpanzee, and human FACS data against group size, food‑sharing rates, and dominance styles.
If facial complexity tracks tolerance as predicted, the findings could rewrite textbooks on primate social evolution.
The study is published in the journal PeerJ.