Scientists learn to read what mice are thinking through facial expressions
November 17, 2025

In new mouse experiments, an international team showed that tiny facial signals can reveal a problem-solving strategy as reliably as recordings from dozens of neurons.

The work, done at the Champalimaud Foundation in Lisbon, points to real benefits and risks: the results open doors for basic research, yet they also raise clear questions about mental privacy.

Cameras capture facial signals

The work was led by Zachary Mainen, Ph.D. His research focuses on neural circuits that support learning and decision making.

Mice faced a simple choice while cameras filmed their faces. They sampled two water spouts delivering sugary rewards whose value changed over time, and the animals adjusted their strategies as conditions shifted.

Researchers analyzed the videos with machine learning, computer methods that find patterns in data. They compared what the facial signals revealed with what simultaneously recorded groups of neurons could tell them.

“To our surprise, we found that we can get as much information about what the mouse was ‘thinking’ as we could from recording the activity of dozens of neurons,” said Mainen, a principal investigator at the Champalimaud Foundation.

Facial signals linked to decisions

The team tracked subtle facial features, frame by frame, using Facemap. This tool summarizes movements of whiskers, nose, eyelids, and jaw into compact signals that are easy to analyze.
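To make that pipeline concrete, here is a minimal sketch of the kind of processing Facemap performs: compute motion energy between video frames, then compress it with a singular-value decomposition. The video array, shapes, and component count below are illustrative stand-ins, not the library's actual interface.

```python
# Sketch of Facemap-style processing: reduce a face video to a few
# compact motion signals. `frames` is a placeholder (time, height, width)
# grayscale array; the real Facemap pipeline differs in detail.
import numpy as np

rng = np.random.default_rng(0)
frames = rng.random((500, 64, 64))          # placeholder video, 500 frames

motion = np.abs(np.diff(frames, axis=0))    # motion energy between frames
motion = motion.reshape(motion.shape[0], -1)
motion -= motion.mean(axis=0)               # center each pixel's signal

# SVD compresses thousands of pixels into a handful of motion components
U, S, Vt = np.linalg.svd(motion, full_matrices=False)
n_components = 10
facial_signals = U[:, :n_components] * S[:n_components]   # (time-1, 10)
print(facial_signals.shape)
```

The resulting components summarize coordinated movements of whiskers, nose, and jaw in just a few numbers per frame, which is what makes the downstream analysis tractable.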

They then linked those facial signals to a decision variable, a running value the brain keeps to guide choices. That linkage held even when the mouse was quietly considering options rather than acting.
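As an illustration of how such a linkage can be tested, the sketch below regresses a decision variable onto facial signals with cross-validated ridge regression. Both arrays are simulated stand-ins; the study's actual decoding models may differ.

```python
# Hedged sketch: regress a running decision variable onto facial signals.
# All data here are simulated for illustration.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
facial_signals = rng.standard_normal((499, 10))   # stand-in facial components
true_weights = rng.standard_normal(10)
decision_variable = facial_signals @ true_weights + 0.5 * rng.standard_normal(499)

# Cross-validated R^2 estimates how much of the latent variable the face explains
scores = cross_val_score(Ridge(alpha=1.0), facial_signals,
                         decision_variable, cv=5, scoring="r2")
print(f"mean cross-validated R^2: {scores.mean():.2f}")
```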

Key patterns appeared before or during choices, not only after the mouse moved. That timing matters because it shows faces can reflect computations, not just the movements that follow them.

Because video recording is noninvasive (it does not enter the body or damage tissue), this approach can watch mental processes without surgery. That is useful for long experiments and for comparisons across animals.

Shared facial signals across mice

A striking result was that different animals showed similar facial codes for the same inner state. The same mix of tiny movements mapped to the same strategy across multiple mice.

That reproducibility suggests a shared blueprint for expressing thought on the face. It means the signals are not idiosyncratic tics but structured patterns that can be decoded.

Parts of these facial signals traced back to the secondary motor cortex, a brain area that helps plan movements. When activity there was dampened, the face carried less information about the decision variable.

Earlier work showed that spontaneous behaviors can drive widespread brain activity. The new findings go further by tying micro-level facial signals to specific latent computations.

What this could mean for health

These facial readouts could become a biomarker. Scientists could monitor how thinking strategies change with learning, aging, or illness.

They could also track whether a drug shifts how an animal evaluates evidence. That would give researchers a fast, low-stress readout that complements brain recordings.

Because the facial signals are so accessible, this method might aid preclinical studies. It could help in building better animal models of psychiatric or neurological conditions.

Any translation to humans would need careful testing. Human internal states are expressed through faces, voices, posture, and context, and that mix is richer and less stereotyped than in mice.

Protecting mental privacy

Videos may carry more than behavior. The researchers noted that while the findings are scientifically exciting, they also raise important questions about how to safeguard mental privacy in the future.

Those concerns are not theoretical. The OECD issued a 2019 recommendation that urges safeguards for personal brain data, consent, and transparency in neurotechnology.

A related policy discussion focuses on neural data, which is information derived from brain or nervous system signals. Rules for how such data are collected, shared, and used will only grow in importance.

Practical steps exist today. Clear labeling, opt-in consent, and strict limits on reuse can reduce risks while allowing responsible research to proceed.

Reality check and next steps

This is a mouse study in a controlled setting. The cameras watched trained animals during a specific task, and the algorithms learned patterns from many trials.

The results say nothing about reading full thoughts as sentences. They show that specific decision computations leave traceable signatures on the face.

Future work will test new tasks, longer timescales, and stress or distraction. It will also probe how stable these facial codes remain across development and disease.

On the methods side, researchers can refine the models that connect facial signals to brain activity. Better models should improve sensitivity while reducing false positives.
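One standard way to keep false positives in check is a shuffle control: break the face-to-variable correspondence by permuting the target, then ask whether the real decoder still beats the null distribution. The sketch below uses simulated data and illustrative settings, not the study's actual validation procedure.

```python
# Hedged sketch: a shuffle control to guard against false positives
# when decoding. All data are simulated.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.standard_normal((400, 10))                 # facial signals
y = X @ rng.standard_normal(10) + rng.standard_normal(400)

real = cross_val_score(Ridge(), X, y, cv=5, scoring="r2").mean()

# Null distribution: shuffling the target destroys any genuine linkage
null = [cross_val_score(Ridge(), X, rng.permutation(y), cv=5,
                        scoring="r2").mean() for _ in range(100)]
p_value = (np.sum(np.array(null) >= real) + 1) / (len(null) + 1)
print(f"real R^2 = {real:.2f}, shuffle p = {p_value:.3f}")
```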

Science and the future of facial signals

The study adds weight to an idea called embodied cognition, the view that body and brain jointly shape thinking. Here, the body’s small movements mirror internal computations as they unfold.

It also shows the value of simple tools used wisely. Off-the-shelf cameras and open methods can reveal hidden structure in behavior.

Openness will matter for progress. The authors shared data locations and code resources to help others replicate and extend the findings.

Community standards can speed adoption while avoiding hype. Clear benchmarks, preregistered analyses, and data sharing can keep the field on solid ground.

The study is published in Nature Neuroscience.
