Can blind users hear & feel their way?
User Test of an audio-haptic sensory substitution product to support blind users in wayfinding
Imagine this...
Worldwide, a staggering 253 million people live with vision impairment. Most of them rely on the white cane to find their way. However, the white cane is limited by its reach of only 150 cm ahead. Using it, blind users can neither find the best path ahead nor notice moving or elevated objects.
Thus, a big team of engineers, researchers and software developers from all across Europe set out to improve the independence of blind people. After 2 years of hard work, the team proudly presented a first prototype: a system that could possibly enable blind users to hear & feel their surroundings!
They were excited - would it work?

Project steps
1. The product & task
"Can blind users hear & feel their way with our product?"

The photos above show the product prototype as worn by the user, with the camera unit, the haptic belt with motors, and a processing unit on the back.
The YouTube link below shows a 4-minute summary video of how the product was developed and tested.
What happened?
In the previous two years, a team of more than 90 researchers & engineers from all across Europe had developed a system aiming to improve the mobility of visually impaired users. Worn by a user, it recognises obstacles via a camera and translates them into audio and vibration patterns.

My task:
Design, conduct and analyse a usability test to evaluate whether blind users can easily use the product prototype to understand their surroundings and successfully avoid obstacles.

The product:
A hardware & software setup conveying a haptic & auditory representation of the environment, continuously and in real time. The system comprises a camera headset, earphones for the sound signals and a haptic belt for the vibration signals. ---> see photos & video (left)
Challenge: The project required coordination & communication with a hybrid, cross-functional team of more than 90 members across 9 institutions from Iceland, Romania, Italy, Poland and Hungary, consisting of psychologists, technical engineers, software developers and mobility instructors.
2. Drawing up a study plan
"How to measure successful usage in this context?"


Photos of example obstacle scenes with cardboard boxes, which users were asked to pass without collision.
Key research question:
After completing the complementary training program, can blind users use the prototype to understand their environment and thus navigate successfully?
How to measure successful use?
We based it on the 3 components of usability:
- efficiency: time for obstacle course completion
- efficacy: number of collisions (brushing objects) and number of correctly identified obstacles (pointing at them before passing)
- satisfaction: questionnaire with 11 items (Likert scale, exploring ease of use, confidence and satisfaction)

What does successful mean?
- we decided to use performance with the white cane as a benchmark for comparison
- in addition, performance should improve with increasing training time
The obstacle course: We needed a standardised, mobile obstacle scene that could be easily rearranged - without posing a risk of injury to users.
Challenge: The obstacle scenes needed to be of comparable difficulty but also new each round; otherwise we would just have measured memory of the scenes.
Solution: We defined rules for setting up the scenes, so that obstacles were placed randomly but within defined constraints, e.g. 15 m distance between start & goal, 10 obstacles of the same size, minimal distances between obstacles, etc. (see the sketch below).
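To make this concrete, here is a minimal Python sketch of such a rule-based scene generator, using rejection sampling within a bounding area. The 15 m course length and 10 obstacles come from the rules above; the course width, minimum spacing, clear zones and all names are assumptions for illustration, not the project's actual setup procedure.

```python
import math
import random

# Assumed constraint values for illustration only; the real study defined its own rules.
COURSE_LENGTH_M = 15.0   # distance between start & goal (from the rules above)
COURSE_WIDTH_M = 4.0     # assumed width of the walking area
NUM_OBSTACLES = 10       # from the rules above
MIN_SPACING_M = 1.5      # assumed minimal distance between obstacles
CLEAR_ZONE_M = 2.0       # assumed obstacle-free zone around start & goal


def generate_scene(seed=None):
    """Place obstacles at random, but only within the defined constraints,
    so that every scene is new yet of comparable difficulty."""
    rng = random.Random(seed)
    obstacles = []
    attempts = 0
    while len(obstacles) < NUM_OBSTACLES:
        attempts += 1
        if attempts > 10_000:
            raise RuntimeError("Constraints are too tight for this course size.")
        x = rng.uniform(CLEAR_ZONE_M, COURSE_LENGTH_M - CLEAR_ZONE_M)
        y = rng.uniform(0.0, COURSE_WIDTH_M)
        # Reject candidate positions that crowd an already placed obstacle.
        if all(math.hypot(x - ox, y - oy) >= MIN_SPACING_M for ox, oy in obstacles):
            obstacles.append((x, y))
    return obstacles


if __name__ == "__main__":
    for i, (x, y) in enumerate(generate_scene(seed=1), start=1):
        print(f"obstacle {i:2d}: x = {x:4.1f} m, y = {y:3.1f} m")
```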
​
Challenge: The extensive training of 8 hours per participant, topped off by 5 performance measurement rounds per participant, each with and without the cane, and, last but not least, a considerable delay in the delivery of the prototype led us to limit the number of participants to 6.
3. Train, measure, repeat

Photos of a blind participant training navigation with 2 obstacles in the virtual training environment, sitting at a laptop and using a joystick.
Below: the real-world training setup with cardboard boxes. Above: the same scene as recognised by the system camera.

Virtual environment training:
For the first 4 hours, users trained in a virtual environment to get acquainted with the system through carefree exploration. Users learned to distinguish properties of single objects (e.g., size, direction, distance), followed by properties of multiple objects. Then, using a joystick, users actively navigated through virtual scenes.

Real-world training:
The next 4 hours of training took place with cardboard boxes as obstacles. Training started with exploration of an empty room, followed by scenes with a single obstacle, where users were asked to point at it and judge its distance and size. Finally, users trained in scenes with up to 10 obstacles, navigating towards a loudspeaker playing music on the opposite side of the room.
How much do users improve with training?
When did we measure performance?
- 1 x baseline before training
- 4 x during training, once after every 2 hours of training

How did we measure performance?
Completion time, collisions and number of identified obstacles when completing the standardised obstacle course. This was measured 3 x each time, with users using...
- the white cane only (baseline)
- the white cane & product
- the product only
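As an illustration of how each measured run could be captured, here is a minimal sketch of a flat trial record, assuming a simple tabular structure. The labels mirror the study design above (5 measurement points, 3 device conditions, completion time, collisions, identified obstacles), but the data structure and field names are assumptions, not the project's actual tooling.

```python
from dataclasses import dataclass, asdict

# Assumed labels for illustration; they mirror the study design described above.
CONDITIONS = ("cane_only", "cane_and_product", "product_only")
TIME_POINTS = ("baseline", "after_2h", "after_4h", "after_6h", "after_8h")


@dataclass
class TrialRecord:
    """One obstacle-course run of one participant in one device condition."""
    participant: str            # e.g. "P1" ... "P6"
    time_point: str             # one of TIME_POINTS
    condition: str              # one of CONDITIONS
    completion_time_s: float    # efficiency: time to complete the course
    collisions: int             # efficacy: number of obstacles brushed
    identified_obstacles: int   # efficacy: obstacles pointed at before passing


# Example: one hypothetical run (all values made up for illustration).
run = TrialRecord(
    participant="P1",
    time_point="after_4h",
    condition="product_only",
    completion_time_s=95.0,
    collisions=1,
    identified_obstacles=8,
)
print(asdict(run))
```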
4. Answering the question
"Can blind users navigate with our product? - Yes, they can!"

The course of collision frequency with obstacles per scene when using the product, plotted for each individual as a function of training progress.

Median collision frequency per scene for the three assistive device conditions: the product only, white cane only, and both assistive devices.
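As a sketch of how aggregates like those in the plots above could be computed from such trial records - per-participant collision curves over training and median collisions per device condition - the following pandas snippet uses a small made-up table; the column names carry over from the record sketch above and all values are purely illustrative.

```python
import pandas as pd

# A small made-up table of trial records (columns mirror the record sketch above);
# all values are purely illustrative.
trials = pd.DataFrame(
    {
        "participant": ["P1", "P1", "P2", "P2", "P1", "P2"],
        "time_point": ["baseline", "after_4h", "baseline", "after_4h", "after_4h", "after_4h"],
        "condition": ["product_only", "product_only", "product_only", "product_only",
                      "cane_only", "cane_and_product"],
        "collisions": [5, 1, 6, 2, 1, 1],
    }
)

# Per-participant collision frequency over training when using the product only.
product_runs = trials[trials["condition"] == "product_only"]
collision_curves = product_runs.pivot_table(
    index="time_point", columns="participant", values="collisions"
)

# Median collision frequency per scene for each device condition.
median_by_condition = trials.groupby("condition")["collisions"].median()

print(collision_curves)
print(median_by_condition)
```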
Can users hear & feel obstacles?
YES! Training with the product leads to surprisingly rapid improvements in mobility, quickly reaching performance levels seen with the white cane. The data shows that after just 4 hours of training, users completed the scenes while using the product with as few collisions as when using the white cane.
Can users navigate efficiently?
NOT YET! More training with the product would be required to bring completion times down to a level comparable with the white cane.
Challenge: By asking users to point out obstacles before passing, we emphasised accuracy over speed. Interestingly, users tended to slow down once they gained an understanding of the audio-haptic representation, trying to apply lessons learned from training.
Challenge: There were considerable individual differences, which limits the generalisability of the results. We were still able to identify subgroups of users, one of which seems to be a highly promising target group.

Overall, the data provides encouraging evidence for the usability of the product, indicating that blind users are able to adapt to a haptic-auditory representation of their environment within a reasonable amount of training time.
5. Spreading the good news


Presenting our product at research conferences, and celebrating the Innovation Radar Prize at ICT 2018.

Since this was a European Union-funded research project, the key project outcomes were related to
- reporting to the European Commission
- publishing the insights in scientific conferences and papers

My output contributions:
- coordinating and writing the internal, international product evaluation report to the European Commission
- scientific paper I: "Evaluation of an audio-haptic sensory substitution device for enhancing spatial awareness for the visually impaired" >> see here
- scientific paper II: "Blind wayfinding with physically-based liquid sounds" >> see here

Our product was well received and generally acclaimed, e.g. we won the Innovation Radar Prize “Tech for Society” Award at ICT 2018, Vienna.
Happy ending!
Finally...
After this extensive first round of testing with blind participants, the team knew that their mission was a success:
Unlike the white cane, their system could indeed enable blind users to hear & feel their environment! The system allows users to identify obstacle-free spaces and plan the best path ahead. Further tests showed that the system also allows blind users to detect moving and elevated objects, thereby genuinely improving spatial awareness.
The team was positive that their contribution could improve the wellbeing of many visually impaired individuals!

