
Touch sensor for social robots and interactive objects affective interaction

Mazzei, Daniele; De Maria, Carmelo; Vozzi, Giovanni
2016-01-01

Abstract

The recognised importance of physical experience in empathic exchanges has led to the development of touch sensors for human–robot affective interaction. Most of these sensors, implemented as matrices of pressure sensors, are rigid, cannot be fabricated in complex shapes, cannot withstand large deformations, and usually capture only the contact event, without any information about the interaction context. This paper presents a tactile flux sensor able to capture the entire context of the interaction, including gestures and patterns. The sensor is made of alternating layers of sensitive and insulating silicone; its soft nature makes it adaptable to complex and deformable bodies. The main features of the electrical signals are extracted with principal component analysis, and a self-organising neural network performs the classification and spatial identification of the events, to acknowledge and measure the gesture. The results open the way to interesting applications, spanning from toy manufacturing to human–robot interaction, and even to sport and biomedical equipment.
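The abstract describes a two-stage pipeline: principal component analysis to extract the main features of the electrical signals, followed by a self-organising neural network that classifies and spatially identifies the events. A minimal sketch of that general PCA-plus-SOM pipeline is below; it is purely illustrative and not the authors' implementation — the synthetic feature vectors, the 4×4 map size, and the training schedule are all assumptions.

```python
# Illustrative sketch (not the paper's implementation): reduce synthetic
# "touch signal" feature vectors with PCA, then place each sample on a
# small self-organising map (SOM) so similar gestures land on nearby units.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for extracted electrical-signal features:
# two "gesture" clusters in a 10-dimensional feature space.
X = np.vstack([
    rng.normal(loc=0.0, scale=0.3, size=(50, 10)),
    rng.normal(loc=2.0, scale=0.3, size=(50, 10)),
])

# --- PCA: project onto the top-2 principal components ---
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T  # (100, 2) low-dimensional features

# --- Minimal SOM: a 4x4 grid of weight vectors in PCA space ---
grid = np.array([(i, j) for i in range(4) for j in range(4)], dtype=float)
W = rng.normal(size=(16, 2))

def bmu(z):
    """Index of the best-matching unit (nearest weight vector) for z."""
    return int(np.argmin(np.linalg.norm(W - z, axis=1)))

epochs = 20
for epoch in range(epochs):
    lr = 0.5 * (1 - epoch / epochs)            # decaying learning rate
    sigma = 2.0 * (1 - epoch / epochs) + 0.5   # decaying neighbourhood radius
    for z in Z[rng.permutation(len(Z))]:
        b = bmu(z)
        # Gaussian neighbourhood on the grid pulls nearby units toward z.
        d = np.linalg.norm(grid - grid[b], axis=1)
        h = np.exp(-(d ** 2) / (2 * sigma ** 2))
        W += lr * h[:, None] * (z - W)

# Samples from the two clusters should map to different regions of the grid.
units_a = {bmu(z) for z in Z[:50]}
units_b = {bmu(z) for z in Z[50:]}
print("shared units:", sorted(units_a & units_b))
```

In this toy setting the two clusters separate along the first principal component, so their best-matching units fall in different regions of the map; classifying a new sample then amounts to finding its best-matching unit.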
Files in this record:

File: output(1).pdf
Access: open access
Type: Pre-print
Licence: Creative Commons
Size: 967.07 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/814139
Citations
  • Scopus: 9
  • Web of Science (ISI): 6