Generating emotional body expressions for socially assistive robots has gained increasing attention as a means of enhancing engagement and empathy in human-robot interaction. In this paper, we propose a new model of emotional body expression for robots, inspired by the social and emotional development that infants undergo with their parents. Infants are often influenced by social referencing: they perceive their parents’ interpretations of emotional situations and use them to form their own. Analogously, robots can be designed to generate representative emotional behaviors using self-organizing neural networks trained on a variety of emotional behavior samples from human partners. We validate the proposed emotional behavior expression on a public human action dataset, which will facilitate the acquisition of emotional body expressions for socially assistive robots.
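The abstract describes learning representative behaviors with a self-organizing neural network trained on human behavior samples. The sketch below is a minimal, hypothetical illustration of that idea using a classic self-organizing map (SOM): synthetic vectors stand in for per-frame motion features (the paper itself uses human body-expression data), and the trained map's weight vectors act as representative "prototype" behaviors. All names, grid sizes, and data here are illustrative assumptions, not the paper's actual method or parameters.

```python
import numpy as np

def train_som(samples, grid_h=4, grid_w=4, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small self-organizing map; each row of the returned array is a
    prototype vector summarizing a cluster of input samples."""
    rng = np.random.default_rng(seed)
    dim = samples.shape[1]
    weights = rng.random((grid_h * grid_w, dim))          # random initial prototypes
    coords = np.array([(i, j) for i in range(grid_h)
                               for j in range(grid_w)], dtype=float)
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)                    # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)              # shrinking neighborhood
        for x in rng.permutation(samples):
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
            d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)      # grid distance to BMU
            h = np.exp(-d2 / (2.0 * sigma ** 2))                  # neighborhood kernel
            weights += lr * h[:, None] * (x - weights)            # pull toward sample
    return weights

# Hypothetical stand-in data: two clusters of 6-dimensional "motion feature"
# vectors, e.g. mimicking distinct emotional expressions in a real dataset.
rng = np.random.default_rng(1)
happy = rng.normal(0.8, 0.05, size=(50, 6))
sad = rng.normal(0.2, 0.05, size=(50, 6))
samples = np.vstack([happy, sad])
prototypes = train_som(samples)   # 16 representative behavior prototypes
```

After training, each input sample maps to its nearest prototype, so the small set of prototypes can serve as a compact repertoire of representative behaviors for the robot to reproduce.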
Nguyen Tan Viet Tuyen, Sungmoon Jeong, and Nak Young Chong, "Learning Human Behavior for Emotional Body Expression in Socially Assistive Robotics," URAI 2017: International Conference on Ubiquitous Robots and Ambient Intelligence, Maison Glad Jeju, Jeju, Korea, June 28-July 1, 2017.