Northwestern University engineers unveil MobilePoser, a state-of-the-art mobile app capable of real-time full-body motion capture using consumer devices. Discover its potential to revolutionize gaming, fitness, health care and beyond.
Engineers at Northwestern University have developed a groundbreaking system, MobilePoser, which can perform real-time, full-body motion capture using only everyday consumer devices. This innovative technology leverages sensors already present in electronics most people carry anyway, such as smartphones, smartwatches and wireless earbuds.
“Running in real time on mobile devices, MobilePoser achieves state-of-the-art accuracy through advanced machine learning and physics-based optimization, unlocking new possibilities in gaming, fitness and indoor navigation without needing specialized equipment,” lead researcher Karan Ahuja, an assistant professor of computer science at Northwestern’s McCormick School of Engineering, said in a news release.
The system, showcased at the 2024 ACM Symposium on User Interface Software and Technology in Pittsburgh on October 15, represents a significant leap in accessibility for motion capture technology.
Unlike traditional systems that require expensive equipment and specialized environments, MobilePoser democratizes access by using inertial measurement units (IMUs) already integrated into mobile devices.
Limitations of Current Systems
Traditional motion capture systems involve actors donning sensor-covered suits and working in specialized environments, often costing upwards of $100,000. Even consumer-level solutions, like Microsoft Kinect, require stationary cameras and are not viable for mobile applications.
“This is the gold standard of motion capture, but it costs upward of $100,000 to run that setup,” Ahuja added. “We wanted to develop an accessible, democratized version that basically anyone can use with equipment they already have.”
Breakthrough Technology
To address these limitations, Ahuja’s team utilized IMUs, which contain accelerometers, gyroscopes and magnetometers. While standard IMUs in smartphones lack the fidelity for precise motion capture, the Northwestern team enhanced their functionality using a multi-stage AI algorithm trained on high-quality motion capture data.
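To make the sensor bundle concrete: a typical phone IMU reports nine axes of data per sample. The sketch below is a hypothetical representation of such a reading (the field names and units are assumptions for illustration, not MobilePoser's actual data format), with a quick sanity check that a stationary device should read roughly one g of acceleration.

```python
from dataclasses import dataclass


@dataclass
class IMUSample:
    """One 9-axis IMU reading: the sensor bundle found in most phones."""
    accel: tuple  # linear acceleration (x, y, z), m/s^2
    gyro: tuple   # angular velocity (x, y, z), rad/s
    mag: tuple    # magnetic field (x, y, z), microtesla


def gravity_magnitude(sample: IMUSample) -> float:
    """Rough sanity check: a stationary IMU should read about 9.81 m/s^2."""
    ax, ay, az = sample.accel
    return (ax * ax + ay * ay + az * az) ** 0.5


# A phone lying flat on a table: gravity shows up on the z-axis.
resting = IMUSample(accel=(0.0, 0.0, 9.81),
                    gyro=(0.0, 0.0, 0.0),
                    mag=(22.0, 5.0, -41.0))
print(round(gravity_magnitude(resting), 2))  # 9.81
```

The low fidelity Ahuja's team had to work around comes from noise and drift in exactly these raw channels, which is why the learned model is trained on high-quality motion capture data rather than relying on the sensors alone.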
The algorithm estimates joint positions, rotations, walking speed, direction and foot-ground contact. A physics-based optimizer further refines these estimates, ensuring they align with real-world constraints, such as joint flexibility.
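The estimate-then-refine idea can be sketched in a few lines. This is a minimal illustration, not the authors' actual model: the "estimator" stands in for the neural network's per-joint prediction, and the refinement step is reduced to clamping a single joint angle to an anatomically plausible range, assuming a knee limit of roughly 0 to 150 degrees.

```python
def estimate_knee_angle(raw_degrees: float) -> float:
    # Stand-in for the learned estimator: in the real system this would
    # be a neural network's prediction from IMU data, not a pass-through.
    return raw_degrees


def refine_with_constraints(angle_deg: float,
                            lo: float = 0.0,
                            hi: float = 150.0) -> float:
    """Physics-based refinement: pull the estimate back inside the
    joint's feasible range of motion."""
    return max(lo, min(hi, angle_deg))


# A noisy estimate of 162 degrees exceeds a typical knee's range of
# motion, so the optimizer snaps it to the nearest feasible value.
noisy = estimate_knee_angle(162.0)
print(refine_with_constraints(noisy))  # 150.0
```

MobilePoser's real optimizer solves a joint physics problem over the whole body rather than clamping angles independently, but the principle is the same: the network proposes, and real-world constraints dispose.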
“The accuracy is better when a person is wearing more than one device, such as a smartwatch on their wrist plus a smartphone in their pocket,” added Ahuja. “But a key part of the system is that it’s adaptive. Even if you don’t have your watch one day and only have your phone, it can adapt to figure out your full-body pose.”
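One common way to build the adaptivity Ahuja describes is to give the model a fixed-size input with a slot per possible device, zero-filling the slots for devices the user is not wearing. The sketch below assumes that design and invents the slot names and feature layout for illustration; MobilePoser's actual input scheme is defined in the team's open-source code.

```python
# Hypothetical device slots and feature layout (assumptions, not the
# project's real configuration).
DEVICE_SLOTS = ["phone_pocket", "watch_wrist", "earbuds"]
FEATURES_PER_DEVICE = 9  # accel + gyro + mag, 3 axes each


def build_input(available: dict) -> list:
    """Assemble a fixed-size input vector, zero-filling the slots of
    absent devices so one model handles any subset of worn devices."""
    vec = []
    for slot in DEVICE_SLOTS:
        vec.extend(available.get(slot, [0.0] * FEATURES_PER_DEVICE))
    return vec


# Phone-only day: the watch and earbuds slots are padded with zeros.
phone_only = build_input({"phone_pocket": [0.1] * FEATURES_PER_DEVICE})
print(len(phone_only))    # 27
print(phone_only[9:18])   # zeros where the missing watch would report
```

With this layout the same trained network sees a consistent input shape whether the user carries one device or three, which is what lets accuracy degrade gracefully rather than fail outright when a device is missing.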
Potential Impacts
While MobilePoser is poised to revolutionize gaming with more immersive experiences, its applicability spans far beyond entertainment. Fitness enthusiasts can use the app to monitor their posture during exercise, ensuring correct form and improving workout effectiveness. Medical professionals can leverage the technology for detailed analysis of patients’ mobility and gait, surpassing simple step-counting.
“Right now, physicians track patient mobility with a step counter. That’s kind of sad, right? Our phones can calculate the temperature in Rome. They know more about the outside world than about our own bodies,” Ahuja added. “We would like phones to become more than just intelligent step counters. A phone should be able to detect different activities, determine your poses and be a more proactive assistant.”
To foster further advancements, Ahuja’s team has made their pre-trained models, data pre-processing scripts and model training code available as open-source software. The app will soon be accessible on iPhone, AirPods and Apple Watch.