A new study shows that artificial intelligence can pick up faint shifts in the eyes and mouth that relate to mild depressive symptoms. The signals are too subtle for most people to notice during a quick first impression.
The focus here is subthreshold depression, a mild set of depressive symptoms that does not meet clinical criteria but still raises future risk. It can shape how someone comes across without making them look obviously upset or tense.
The project was led by Eriko Sugimori at Waseda University in Japan. Her team used simple, short videos to test whether technology could spot what our eyes usually miss.
Depression often dampens positive expression more than negative expression, a pattern supported by a previous analysis. That shift can change day to day social feedback, which can then quietly reinforce low mood.
Missing weak signals has a cost. People can move through school or work with fewer friendly responses and not know why that keeps happening.
“As concerns around mental well-being have been rising, I wanted to explore how subtle nonverbal cues, such as facial expressions, shape social impressions and reflect mental health using artificial intelligence-based facial analysis,” said Sugimori.
The aim was practical early notice rather than diagnosis. Early notice matters because support and monitoring can start before symptoms solidify into a major episode.
Sixty-four Japanese undergraduates recorded self-introduction videos while facing the camera. A separate group of 63 students rated how expressive, natural, friendly, and likable the speakers seemed. Students in both groups also completed the Beck Depression Inventory-II (BDI-II).
Using OpenFace, an open-source tool built on the Facial Action Coding System (FACS), the researchers tracked facial action units – small, distinct muscle movements like inner brow raises and mouth openings.
Separating the speakers from the raters avoided familiarity effects. The audio was muted so that judgments rested on expression rather than voice.
The clips were trimmed to the first 10 seconds to model an everyday first impression. That choice kept the task simple for future use in schools or workplaces.
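For readers curious about the mechanics, here is a minimal sketch, not the team's actual pipeline, of how OpenFace output is commonly summarized. It assumes you have already run OpenFace's FeatureExtraction tool on a clip, which writes a per-frame CSV with action unit intensity columns such as AU01_r; the file path, the 10-second window, and the particular AU selection below are illustrative.

```python
import pandas as pd

# OpenFace's FeatureExtraction tool writes one CSV per video, with
# per-frame action unit (AU) intensity columns named AU01_r, AU05_r, etc.
# The AU codes below follow standard FACS numbering for the movements
# named in the article: inner brow raiser (AU01), upper lid raiser (AU05),
# lip stretcher (AU20), and lips part (AU25). The exact set the study
# analyzed is an assumption here.
AUS_OF_INTEREST = ["AU01_r", "AU05_r", "AU20_r", "AU25_r"]

def summarize_aus(csv_path: str, max_seconds: float = 10.0) -> pd.Series:
    """Return mean AU intensity over the first `max_seconds` of one clip."""
    df = pd.read_csv(csv_path)
    df.columns = df.columns.str.strip()       # OpenFace pads its column names
    df = df[df["success"] == 1]               # keep frames the tracker trusts
    df = df[df["timestamp"] <= max_seconds]   # mirror the 10-second trim
    return df[AUS_OF_INTEREST].mean()

# Usage with a hypothetical recording:
# features = summarize_aus("speaker_01.csv")
# print(features)
```

Averaging intensities is only one reasonable summary; counting how often each AU activates above a presence threshold is another common choice.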
Students with subthreshold depression scored lower on positive items. They looked less expressive, less natural, less friendly, and less likable – even though they were not judged as being more stiff, fake, or nervous.
The automated analysis found that more frequent eye and mouth movements were tied to higher depression scores. Key signals included the inner brow raiser, the upper lid raiser, the lip stretcher, and mouth-opening action units.
Those subtle patterns were linked to the self-reported BDI-II scores, and the associations were strong enough to stand out despite the clips being brief. Raters’ own mood scores did not shift their judgments, which suggests the effect came from the faces being viewed, not from the minds of the viewers.
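To make that linkage concrete, the sketch below shows one way to correlate per-speaker AU summaries with BDI-II totals. This is a hedged illustration, not the paper's analysis: the table layout, the column names, and the choice of Spearman correlation are assumptions.

```python
import pandas as pd
from scipy.stats import spearmanr

def au_depression_correlations(data: pd.DataFrame, aus: list[str],
                               score_col: str = "bdi_ii") -> pd.DataFrame:
    """Spearman correlation between each AU summary and BDI-II totals.

    `data` is assumed to hold one row per speaker: mean AU intensities
    (for example, from the sketch above) plus a BDI-II total column.
    Spearman is one reasonable choice for questionnaire scores, not
    necessarily the statistic the study used.
    """
    rows = []
    for au in aus:
        rho, p = spearmanr(data[au], data[score_col])
        rows.append({"action_unit": au, "rho": rho, "p_value": p})
    return pd.DataFrame(rows)

# Usage with a hypothetical per-speaker table:
# results = au_depression_correlations(
#     speakers, ["AU01_r", "AU05_r", "AU20_r", "AU25_r"])
# print(results.sort_values("p_value"))
```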
Display rules shape how people show their feelings in public, and Japan tends to value restraint more than many Western settings. That context matters when interpreting a muted positive style.
East Asian observers often pay closer attention to the eyes, while Western observers read eyes and mouths more evenly. The present results highlight eye and mouth activity, which fits with those known patterns.
Because norms vary, these findings should be tested in other places and groups. Baseline expressivity differs, and so do expectations in social settings. Culture does not erase the signal. It does influence what counts as expressive enough to register as warm or likable.
Short self-introduction videos are easy to collect during orientation or wellness check-ins. Automated scoring could flag people whose positive expressivity has dipped below their peers, inviting a human follow-up.
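As a sketch of how such a flag might work, under assumptions the article does not specify, one could standardize each person's positive-expressivity summary against the peer group and route large negative deviations to a human conversation. The score, the cutoff, and the names below are illustrative.

```python
import pandas as pd

def flag_low_expressivity(scores: pd.Series, z_cutoff: float = -1.5) -> pd.Series:
    """Flag people whose expressivity sits well below the peer average.

    `scores` is assumed to hold one positive-expressivity summary per
    person (for example, rater means or an AU-derived index). The -1.5
    z-score cutoff is an illustrative choice, not a validated clinical
    threshold, and a flag should only ever trigger a human follow-up.
    """
    z = (scores - scores.mean()) / scores.std(ddof=0)
    return z < z_cutoff

# Usage with a hypothetical cohort:
# cohort = pd.Series({"p01": 4.1, "p02": 3.8, "p03": 2.2, "p04": 4.0})
# print(flag_low_expressivity(cohort))
```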
“Our novel approach of short self introduction videos and automated facial expression analysis can be applied to screen and detect mental health in schools, universities, and workplaces,” said Sugimori. The method keeps the bar low for participation.
The same approach could slot into digital health platforms that already use webcams. It would work best when paired with quick questionnaires and clear consent. Human review must stay in the loop. An alert should lead to a conversation, not a label.
The sample was limited to Japanese undergraduates and relied on self-reporting. The tool does not diagnose depression, and it should not be used to make decisions about grades, jobs, or access to services.
Privacy and fairness need front-and-center attention. People should know exactly what is recorded, how long it is kept, and who can see it.
“Overall, our study provides a novel, accessible, and noninvasive artificial intelligence based facial analysis tool for early detection of depression, enabling early interventions and timely care of mental health,” concluded Sugimori. That promise depends on careful rollout and transparent guardrails.
Future work can test longer time frames, repeated measures, and diverse communities. Experts can also explore whether changes in depression signs and action units track recovery during counseling.
The study is published in the journal Scientific Reports.