Estimating Emotional Intensity from Body Poses for Human-Robot Interaction, IEEE SMC 2018 (to appear)
Abstract:
Equipping social and service robots with the ability to perceive human emotional intensities during an interaction is in increasing demand. Most existing work focuses on determining which emotion(s) participants are expressing from facial expressions, but largely overlooks the emotional intensities spontaneously revealed by other social cues, especially body language. In this paper, we present a real-time method for robots to capture fluctuations of participants’ emotional intensities from their body poses. Unlike conventional joint-position-based approaches, our method adopts local joint transformations as pose descriptors, which are invariant to differences in subject body shape and pose-sensor placement. In addition, we use a Long Short-Term Memory Recurrent Neural Network (LSTM-RNN) architecture to take the specific emotion context into account when estimating intensities from body poses. Through dataset evaluations, we show that the proposed method delivers good performance on the test dataset. A series of subsequent field tests on a physical robot further demonstrates that the proposed method effectively estimates subjects’ emotional intensities in real time, and that a robot equipped with our method is perceived to be more emotion-sensitive and more emotionally intelligent.
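To make the described pipeline concrete, the following is a minimal sketch of a context-conditioned LSTM intensity estimator in PyTorch. All sizes, the quaternion-based joint encoding, the one-hot emotion context, and the sigmoid output range are illustrative assumptions, not the paper's actual configuration.

```python
import torch
import torch.nn as nn


class IntensityLSTM(nn.Module):
    """Sketch: per-frame emotional-intensity regression from pose descriptors.

    Each frame is assumed to be a vector of local joint rotations (e.g., unit
    quaternions for num_joints joints -> 4*num_joints values), concatenated
    with a one-hot emotion-context vector so the recurrent model can condition
    its intensity estimate on the emotion being expressed.
    """

    def __init__(self, num_joints=15, num_emotions=6, hidden_size=128):
        super().__init__()
        input_size = 4 * num_joints + num_emotions  # rotations + context
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)  # scalar intensity per frame

    def forward(self, poses, emotion_onehot):
        # poses: (batch, time, 4*num_joints); emotion_onehot: (batch, num_emotions)
        context = emotion_onehot.unsqueeze(1).expand(-1, poses.size(1), -1)
        features = torch.cat([poses, context], dim=-1)
        out, _ = self.lstm(features)
        # Sigmoid keeps estimates in [0, 1]; the paper's output range may differ.
        return torch.sigmoid(self.head(out)).squeeze(-1)


# Toy usage: a 2-second clip at 30 fps from one subject, with one emotion
# context active (index 3 here is arbitrary).
model = IntensityLSTM()
poses = torch.randn(1, 60, 4 * 15)
emotion = torch.zeros(1, 6)
emotion[0, 3] = 1.0
intensity = model(poses, emotion)  # (1, 60) per-frame intensity estimates
```

Conditioning on local joint transformations rather than raw joint positions is what gives the descriptor its invariance: a rotation expressed in a joint's parent-relative frame is unaffected by where the sensor sits or how long the subject's limbs are.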