Measuring audience and actor emotions at a theater play through automatic emotion recognition from face, speech, and body sensors
Gloor P. A.; Guerrazzi E.
2020-01-01
Abstract
We describe a preliminary experiment to track the emotions of actors and audience in a theater play through machine learning and AI. During a 40-minute play in Zurich, eight actors were equipped with body-sensing smartwatches. At the same time, the emotions of the audience were tracked anonymously using facial emotion tracking. In parallel, the emotions in the actors' voices were assessed through automatic voice emotion tracking. This paper demonstrates a first fully automated and privacy-respecting system to measure both audience and actor satisfaction during a public performance.
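
The abstract does not name the software used for the audience-facing facial emotion tracking. The sketch below illustrates one way such anonymous, frame-level tracking could be set up; the choice of OpenCV for camera capture and the open-source fer package for emotion classification is an assumption for illustration, not the authors' actual pipeline. Only aggregate emotion scores per frame are retained, so no faces or identities are stored.

# Minimal sketch of anonymous facial emotion tracking for an audience camera.
# Library choices (OpenCV, fer) are assumptions; the paper does not name its toolchain.
import cv2
from fer import FER

detector = FER(mtcnn=True)        # face detection + emotion classification
capture = cv2.VideoCapture(0)     # camera pointed at the audience
emotion_log = []                  # per-frame aggregate scores only, no images kept

while capture.isOpened():
    ok, frame = capture.read()
    if not ok:
        break
    faces = detector.detect_emotions(frame)  # one dict of emotion scores per face
    if faces:
        labels = faces[0]["emotions"].keys()
        # Average each emotion over all detected faces, then discard the frame.
        emotion_log.append({
            label: sum(f["emotions"][label] for f in faces) / len(faces)
            for label in labels
        })

capture.release()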