A user-centered approach to artificial sensory substitution for blind people assistance: an "egalitarian" approach for special users
Leporini B;
In press
Abstract
Artificial sensory substitution plays a crucial role in several domains, including prosthetics, rehabilitation, and assistive technologies. The sense of touch has historically been the ideal candidate to convey information about the external environment, both contact-related and visual, when the natural action-perception loop is broken or unavailable. This is particularly true for blind people assistance, in which touch elicitation has been used to make content perceivable (e.g. Braille text or graphical reproductions), or to deliver informative cues for navigation. However, despite significant technological advancements in both devices for touch-mediated access to alphanumeric stimuli and technology-enabled haptic navigation supports, the majority of the proposed solutions have met with scarce acceptance in the end-user community. The main reason for this, in our opinion, is the poor involvement of blind people in the design process. In this work, we report on a user-centric approach that we successfully applied to haptics-enabled systems for blind people assistance, whose engineering and validation have received significant input from visually impaired people. We also present an application of our approach to the design of a single-cell refreshable Braille device and to the development of a wearable haptic system for indoor navigation. After a summary of our previous results, we critically discuss future avenues and propose novel solutions for touch-mediated delivery of information for navigation, whose implementation has been entirely driven by feedback collected from real end users.