
Recognition and Expression of Emotions by a Symbiotic Android Head

Mazzei, Daniele; Zaraki, Abolfazl; De Rossi, Danilo Emilio
2015-01-01

Abstract

The creation of social empathic communication channels between social robots and humans has started to become a reality. Nowadays, the development of empathic and affective agents is giving scientists another way to explore the social dimension of human beings. In this work, we introduce the FACE humanoid project, which aims at creating a social and emotional android. FACE is an android head with an articulated neck mounted on a passive body. To enable FACE to perceive and express emotions, two dedicated engines have been developed: a sensory apparatus able to perceive the 'social world', and a facial expression generation engine that allows the robot to express its synthetic emotions. The system has also been integrated with an attention-based gaze generation component that allows the robot to autonomously follow a conversation between its partners. The developed framework has been implemented and tested in several standard human-robot interaction settings. Results demonstrated the robot's promising social capabilities in perceiving and conveying emotions to humans through the generation of perceivable emotional facial expressions and socially aligned behaviour. © 2014 IEEE.
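The abstract mentions an attention-based gaze generation component that lets the robot autonomously follow a conversation between its partners, but gives no implementation details. The sketch below is a minimal, hypothetical illustration (not the authors' method) of how such a component could select a gaze target: each perceived person is scored by simple social-salience cues (speech activity, motion, proximity), and a small hysteresis margin keeps the gaze from switching back and forth too quickly. All names, weights, and cues here are assumptions made for illustration only.

```python
# Hypothetical sketch of an attention-based gaze-target selector.
# The cues, weights, and hysteresis value are illustrative assumptions,
# not taken from the FACE project.
from dataclasses import dataclass

@dataclass
class PerceivedPerson:
    person_id: str
    is_speaking: bool      # e.g. from audio localisation or lip motion
    motion_energy: float   # body/head motion, normalised to [0, 1]
    distance_m: float      # distance from the robot in metres

def salience(p: PerceivedPerson) -> float:
    """Combine social cues into a single salience score (weights are illustrative)."""
    speech_term = 1.0 if p.is_speaking else 0.0
    proximity_term = 1.0 / (1.0 + p.distance_m)  # closer people are more salient
    return 0.6 * speech_term + 0.25 * p.motion_energy + 0.15 * proximity_term

def select_gaze_target(people, current_target=None, hysteresis=0.1):
    """Return the id of the person the robot should look at.

    The current target keeps the gaze unless another person exceeds its
    salience by `hysteresis`, which avoids rapid gaze switching during
    a conversation.
    """
    if not people:
        return None
    scored = {p.person_id: salience(p) for p in people}
    best_id = max(scored, key=scored.get)
    if current_target in scored and scored[best_id] < scored[current_target] + hysteresis:
        return current_target
    return best_id

if __name__ == "__main__":
    people = [
        PerceivedPerson("alice", is_speaking=True, motion_energy=0.2, distance_m=1.2),
        PerceivedPerson("bob", is_speaking=False, motion_energy=0.6, distance_m=0.9),
    ]
    print(select_gaze_target(people))  # -> "alice" (the active speaker wins)
```

In this kind of design, the hysteresis term is what makes the gaze behaviour read as socially aligned rather than reactive, since the robot holds attention on the current speaker instead of darting toward every transient motion.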
2015
ISBN: 978-147997174-9
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/602067
Citations
  • PMC: not available
  • Scopus: 5
  • Web of Science (ISI): not available