New Wayfinding Apps Transform Indoor Navigation for the Visually Impaired

A breakthrough in accessible technology from UC Santa Cruz introduces smartphone apps designed to aid blind and visually impaired individuals in navigating indoor spaces safely and independently.

Navigating indoor spaces independently can be a daunting task for people without sight. Smartphone apps developed at UC Santa Cruz now offer a practical way for blind and visually impaired users to find their way.

Roberto Manduchi, a professor of computer science and engineering at UC Santa Cruz, and his team have made significant strides in accessible technology with the introduction of these apps. Recognizing the unique challenges faced by the blind community, particularly in unfamiliar indoor environments, Manduchi’s years of research have culminated in tools that enhance safety and independence.

“Moving about independently in a place that you don’t know is particularly difficult, because you don’t have any visual reference — it’s very easy to get lost. The idea here is to try to make this a little bit easier and safer for people,” Manduchi said in a news release.

Published in the journal ACM Transactions on Accessible Computing, the paper outlines the development and efficacy of two smartphone applications designed for indoor wayfinding and backtracking. These apps utilize audio cues, enabling navigation without the need to constantly hold the smartphone, which can attract undue attention and pose security risks.

Unlike other smartphone-based wayfinding systems that require visual interaction, these apps are designed with convenience and discretion in mind. They allow users to navigate with their smartphones tucked away, keeping a hand free for a guide dog or cane.

While some companies have deployed indoor wayfinding in specific venues such as airports and stadiums, those systems typically depend on installed infrastructure. Manduchi's team instead relies only on a smartphone's built-in sensors, avoiding the high costs and scalability problems of outfitting each building.

Akin to GPS, Manduchi's team's apps chart an indoor path using building maps, leveraging the smartphone's inertial sensors (accelerometers and gyroscopes) to track the user's movements and orientation. Because inertial estimates drift over time, the apps apply particle filtering to correct for accumulated error, ensuring users don't “walk through walls.”
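The map-constrained particle filtering described above can be illustrated with a minimal sketch. This is not the paper's implementation: the grid map, noise model, and function names (`step_filter`, `is_free`) are all illustrative assumptions. Each particle is a position hypothesis that is advanced with a noisy copy of the step estimate; hypotheses that would pass through a wall are discarded, which is exactly how the map corrects inertial drift.

```python
import math
import random

# Toy occupancy grid standing in for a building map ('#' = wall, '.' = free).
# Purely illustrative; the real apps use actual building maps.
GRID = [
    "##########",
    "#........#",
    "#.######.#",
    "#........#",
    "##########",
]

def is_free(x, y):
    """True if continuous point (x, y) falls on a free grid cell."""
    xi, yi = int(x), int(y)
    return 0 <= yi < len(GRID) and 0 <= xi < len(GRID[0]) and GRID[yi][xi] == "."

def step_filter(particles, step_len, heading, noise=0.2):
    """Advance every particle by one detected step.

    Each particle moves with a perturbed copy of the pedometer/gyro
    estimate (modeling sensor noise).  Particles that would cross a
    wall are dropped, then the survivors are resampled back to the
    original population size.
    """
    moved = []
    for (x, y) in particles:
        h = heading + random.gauss(0, noise)
        d = step_len + random.gauss(0, noise * step_len)
        nx, ny = x + d * math.cos(h), y + d * math.sin(h)
        if is_free(nx, ny):               # map constraint: no walking through walls
            moved.append((nx, ny))
    if not moved:                          # degenerate case: keep the old set
        return particles
    return [random.choice(moved) for _ in particles]

def estimate(particles):
    """Position estimate: the mean of the particle cloud."""
    n = len(particles)
    return (sum(p[0] for p in particles) / n,
            sum(p[1] for p in particles) / n)
```

For example, starting a particle cloud at a known entrance and feeding it five eastward steps keeps the estimate inside the corridor even though every individual step measurement is noisy.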

The backtracking app is particularly noteworthy for its simplicity and effectiveness. It lets users retrace their steps by matching magnetic field anomalies recorded along the original route. Along with voice instructions, users can receive navigational cues as vibrations on a smartwatch, helping them stay safe and attentive to their surroundings.
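The idea of backtracking by magnetic anomalies can be sketched as follows. This is a simplified illustration, not the paper's algorithm: it assumes the app records one magnetometer magnitude per detected step on the outbound walk, then locates the user on the return trip by sliding a window of recent readings along the recorded signature. The function names and the sum-of-squared-differences matching are assumptions for the sketch.

```python
def record_signature(readings):
    """Store the magnetic field magnitudes sampled along the
    outbound path, one value per detected step."""
    return list(readings)

def locate_on_path(signature, window):
    """Find where a short run of recent readings best matches the
    recorded signature, by minimizing the sum of squared differences.
    Returns the step index along the outbound path."""
    best_i, best_err = 0, float("inf")
    for i in range(len(signature) - len(window) + 1):
        err = sum((signature[i + j] - window[j]) ** 2
                  for j in range(len(window)))
        if err < best_err:
            best_i, best_err = i, err
    return best_i
```

On the way back the user traverses the path in reverse, so in practice the recent readings would be matched against the reversed signature; distinctive anomalies (e.g., near steel doors or elevators) are what make the match unambiguous.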

“Sharing responsibility, in my opinion, is the right approach,” Manduchi added. “As a philosophy, you cannot rely on technology alone. That is also true when you drive a car — if it says turn right, you don’t just immediately turn right, you look for where the junction is. You need to work with the system.”

The effectiveness of these apps was demonstrated through extensive testing at UC Santa Cruz’s Baskin Engineering building, where users successfully navigated complex corridors and turns.

With plans to integrate artificial intelligence features, Manduchi’s team aims to further enhance the experience by enabling users to get scene descriptions from photographs of their surroundings. Additionally, they aspire to streamline access to building maps via open-source software ecosystems.

This novel technology promises to revolutionize how the visually impaired navigate indoor spaces, bringing unprecedented autonomy and safety to their daily lives.