A User-Centered Approach to Artificial Sensory Substitution for Blind People Assistance
Bettelani G. C.; Leporini B.; Averta G.; Bianchi M. (Supervision)
2022-01-01
Abstract
Artificial sensory substitution plays a crucial role in several domains, including prosthetics, rehabilitation, and assistive technologies. The sense of touch has historically been the ideal candidate to convey information about the external environment, both contact-related and visual, when the natural action-perception loop is broken or unavailable. This is particularly true for the assistance of blind people, where touch elicitation has been used to make content perceivable (e.g. Braille text or graphical reproductions) or to deliver informative cues for navigation. However, despite significant technological advancements in both devices for touch-mediated access to alphanumeric stimuli and technology-enabled haptic navigation supports, the majority of the proposed solutions have met with scarce acceptance in the end-user community. The main reason for this, in our opinion, is the poor involvement of blind people in the design process. In this work, we report on a user-centric approach that we successfully applied to haptics-enabled systems for the assistance of blind people, whose engineering and validation have received significant input from visually impaired people. We also present an application of our approach to the design of a single-cell refreshable Braille device and to the development of a wearable haptic system for indoor navigation. After a summary of our previous results, we critically discuss future avenues and propose novel solutions for the touch-mediated delivery of navigation information, whose implementation has been entirely driven by feedback collected from real end users.
File | Size | Format
---|---|---
icnr_2020 (1).pdf (open access) | 6 MB | Adobe PDF

Type: Pre-print document
License: All rights reserved
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.