Thea is a concept for an artificially intelligent, on-the-go navigation assistant for the blind and visually impaired. Using the continuous positioning abilities that 5G wireless technology will enable, Thea understands the most effective way to get around and guides users accordingly. As a system, Thea communicates granular directionality through a set of wearable haptic pads and a voice-activated user interface: it interprets natural speech and responds with non-intrusive audio and tactile feedback. Ultimately, Thea helps people with vision impairments overcome mobility challenges and empowers them to navigate their world more confidently.
Vision impairment—often described as blindness or low vision (BLV)—covers a broad spectrum of medical conditions, including cataracts, glaucoma, and macular degeneration. The 3.4 million people in the U.S. who are visually impaired span a wide range, from those who have been blind since birth to those who lose their vision later in life. With different levels of impairment come different comfort levels for traveling outside the home or exploring new environments. As a team of interns, we asked, “How could we design a product that could be understood and accessed by a mass population, while also celebrating individuality and addressing the specific needs of each user?”
As one of the world’s largest cities, New York City lags behind other major metro areas, like Sydney and Tokyo, when it comes to accessibility. The unpredictability of public transit, constantly changing sidewalk infrastructure, and building access all pose problems that are often overlooked by the sighted community. In New York City alone, 5% of the population is blind or has a severe visual impairment—roughly 350,000 people. We aspired to use New York City as a hub for observational research and to expand our findings into a scalable solution that could benefit people with vision impairments everywhere.
From our research with over 60 blind and low-vision people as well as experts, we recognized a few common problems: pre-planning, path details, and last-foot navigation. These common pain points helped us define an opportunity space at the intersection of vision impairment and navigation.
Prior to venturing out of the house, people with vision impairments have to pre-plan extensively, which stifles spontaneity. On the way, they can encounter obstacles—like major train delays or road construction—that derail their plans. These problems are typically easy for a sighted person to deal with, but for those with a visual impairment, they demand immediate, up-to-date information that provides options for re-routing. The “last foot” of navigation is a serious challenge: getting to a subway station may be possible, but finding the correct platform proves difficult, especially when trains switch platforms.
As a result, we created Thea, a concept that offers a comprehensive solution to help the visually impaired independently and safely explore their environments.
Thea is a system that is easy to use and understand. It responds to requests like a real person, adapts to voice inputs, and provides an unparalleled navigation experience.
Visually impaired individuals rely heavily on their sense of hearing, so in high-congestion and noisy areas, Thea switches from audio to haptic feedback. Thea conveys directional information in an intuitive way—its haptic “language” orients users and communicates complex directional information.
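To make the idea of a haptic “language” concrete, here is a minimal sketch of one possible encoding: a relative bearing (where to turn next) is mapped to one of several vibration pads arranged around the body, with intensity scaled by how sharp the turn is. The pad layout, pad count, and intensity scaling are illustrative assumptions, not Thea’s actual design.

```python
def bearing_to_pad(bearing_deg: float, num_pads: int = 8) -> int:
    """Map a relative bearing (0 = straight ahead, positive = clockwise)
    to the index of the closest of `num_pads` evenly spaced pads."""
    sector = 360.0 / num_pads
    # Shift by half a sector so pad 0 is centered on "straight ahead".
    return int(((bearing_deg % 360.0) + sector / 2) // sector) % num_pads

def turn_intensity(bearing_deg: float) -> float:
    """Scale vibration strength 0..1 with how far the turn deviates
    from straight ahead (180 degrees = strongest)."""
    deviation = abs(((bearing_deg + 180.0) % 360.0) - 180.0)  # 0..180
    return round(deviation / 180.0, 2)

# A slight right turn (30 degrees) fires the front-right pad gently;
# a U-turn (180 degrees) fires the rear pad at full strength.
pad = bearing_to_pad(30)        # -> 1 (front-right of 8 pads)
strength = turn_intensity(180)  # -> 1.0
```

Because the pads can be worn anywhere on the body, a scheme like this would only need a per-user calibration step mapping pad indices to physical placements.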
Thea is made of cotton fiber, polymer elastic, and silicone—affordable, comfortable, and easily replaceable materials. The pads can be placed anywhere on the body, giving users the ability to choose where they’d like to feel the directional vibration feedback.
The unique properties that 5G will bring could make Thea a reality in the not-so-distant future. 5G is the next generation of connectivity. With higher data bandwidths and lower latencies than ever before, it will allow for things like edge computing, where most processing takes place outside the device itself, enabling thinner phones and the haptic pads used with Thea. Using inputs like crowd-sourced maps, aggregated location data, and city-wide camera footage, Thea could provide the fastest and most accurate navigation assistance the world has ever seen. The new capabilities of the next-generation network will provide the infrastructure needed to power Thea, and allow each of us—including the blind and low-vision community—to go beyond our limits and better connect with the world around us.