Researchers at Stevens Institute of Technology are pioneering AI-powered smartphone applications, PupilSense and FacePsy, that non-invasively detect signs of depression by analyzing the eyes and facial expressions. These tools could transform mental health diagnostics.
Depression, affecting nearly 300 million people worldwide, remains a challenging condition to diagnose, especially when individuals are unwilling or unable to report their symptoms. Innovative technology, however, may offer a new path. Stevens Institute of Technology professor Sang Won Bae, alongside doctoral candidate Rahul Islam, is at the forefront of developing AI-driven smartphone applications that could serve as non-invasive tools for detecting depression.
Bae’s flagship project, PupilSense, leverages the ubiquity of smartphones by continuously capturing and analyzing snapshots of the user’s pupils.
“And since most people in the world today use smartphones daily, this could be a useful detection tool that’s already built and ready to be used,” Bae said in a news release.
The science behind PupilSense is rooted in decades of research linking pupillary reflexes to depressive episodes. By measuring pupil diameters in relation to the surrounding irises during brief, 10-second photo bursts, the system identifies potential signs of depression.
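For illustration, a minimal sketch of that ratio measurement might look like the following, assuming a cropped grayscale eye image and using OpenCV's Hough circle transform. The function name pupil_iris_ratio and every threshold here are hypothetical choices for the sketch, not values drawn from the PupilSense codebase.

```python
import cv2

def pupil_iris_ratio(eye_image_path):
    """Estimate the pupil-to-iris diameter ratio from one cropped eye image.

    Illustrative only: detector parameters and radius ranges are guesses,
    not values from the published PupilSense pipeline.
    """
    gray = cv2.imread(eye_image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        return None
    gray = cv2.medianBlur(gray, 5)  # suppress sensor noise before detection

    # Detect the larger iris boundary, then the smaller pupil boundary.
    iris = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.5,
                            minDist=gray.shape[0],
                            param1=80, param2=40, minRadius=40, maxRadius=120)
    pupil = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.5,
                             minDist=gray.shape[0],
                             param1=80, param2=25, minRadius=10, maxRadius=39)
    if iris is None or pupil is None:
        return None  # detection failed; skip this frame

    # Column 2 of each detected circle is its radius; the diameter ratio
    # equals the radius ratio.
    return float(pupil[0, 0, 2] / iris[0, 0, 2])
```

Averaging such per-frame ratios over each 10-second burst would yield one pupil measurement per phone session, which is the kind of signal the study correlates with mood.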
In an initial study involving 25 volunteers, PupilSense analyzed around 16,000 phone interactions, achieving a 76% accuracy rate in detecting depressive moods. This performance surpasses that of AWARE, the leading smartphone-based depression detection system to date.
“We will continue to develop this technology now that the concept has been proven,” Bae added.
The open-source version of PupilSense is already available on GitHub, giving researchers and developers an opportunity to contribute to its development.
Another promising development from Bae and Islam’s lab is FacePsy, a system that interprets facial expressions to gauge emotional well-being.
“A growing body of psychological studies suggests that depression is characterized by nonverbal signals such as facial muscle movements and head gestures,” Bae said.
FacePsy operates discreetly in the smartphone’s background, capturing facial images when the device is used and then quickly deleting them to maintain user privacy.
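A capture-analyze-delete pattern of that kind might be sketched as follows. The FaceFeatures fields and the extract_features callable are hypothetical stand-ins, since the article does not detail FacePsy's internal pipeline; the point is only that derived features, not raw photos, are what persist.

```python
import os
from dataclasses import dataclass

@dataclass
class FaceFeatures:
    # Hypothetical feature set, loosely mirroring the signals the article
    # mentions: smiling, eye openness, and head gestures.
    smile_intensity: float
    eye_openness: float
    head_pitch: float

def process_and_discard(image_path: str, extract_features) -> FaceFeatures:
    """Capture-analyze-delete: only derived features outlive the photo.

    `extract_features` stands in for whatever on-device face-analysis
    model is used; the raw image is removed as soon as it has been read.
    """
    try:
        return extract_features(image_path)
    finally:
        os.remove(image_path)  # the raw photo never leaves the device
```

Deleting the file in a finally block ensures the photo is removed even if feature extraction fails, matching the privacy behavior the article describes.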
Initial findings from the FacePsy pilot study revealed intriguing correlations: increased smiling, for example, was linked to depressive moods, possibly reflecting a coping mechanism or an artifact of the study.
Additional depressive signals included fewer facial movements in the mornings and certain head- and eye-movement patterns. Eyes that were more open in the mornings and evenings also hinted at underlying depression, suggesting that outwardly alert expressions can mask deeper emotional struggles.
“Other systems using AI to detect depression require the wearing of a device, or even multiple devices,” Bae added. “We think this FacePsy pilot study is a great first step toward a compact, inexpensive, easy-to-use diagnostic tool.”
The research on FacePsy was presented at the ACM International Conference on Mobile Human-Computer Interaction (MobileHCI) in Australia, lending further weight to its potential as a tool for mental health monitoring.
These groundbreaking technologies mark a significant advance in non-invasive, efficient depression detection, and could make mental health care more accessible and proactive.