
Analysis and generation of laughter motions, and evaluation in an android robot

Carlos Toshinori Ishi, ATR Hiroshi Ishiguro Laboratories, Japan (carlos@atr.jp); Takashi Minato, ATR Hiroshi Ishiguro Laboratories, Japan; Hiroshi Ishiguro, ATR Hiroshi Ishiguro Laboratories, Japan
 
Suggested Citation
Carlos Toshinori Ishi, Takashi Minato and Hiroshi Ishiguro (2019), "Analysis and generation of laughter motions, and evaluation in an android robot", APSIPA Transactions on Signal and Information Processing: Vol. 8: No. 1, e6. http://dx.doi.org/10.1017/ATSIP.2018.32

Publication Date: 25 Jan 2019
© 2019 Carlos Toshinori Ishi, Takashi Minato and Hiroshi Ishiguro
 
Keywords
Emotion expression, Laughter, Motion generation, Human–robot interaction, Non-verbal information
 


Open Access

This article is published under the terms of the Creative Commons Attribution licence.


In this article:
I. INTRODUCTION 
II. RELATED WORK 
III. ANALYSIS DATA 
IV. PROPOSED MOTION GENERATION IN AN ANDROID 
V. DISCUSSION 
VI. CONCLUSION 

Abstract

Laughter occurs frequently in daily interactions; it is not only related to funny situations but also expresses various attitudes, and thus serves important social functions in communication. The background of the present work is the generation of natural motions in a humanoid robot, since miscommunication may arise if the audio and visual modalities are mismatched, especially during laughter events. In the present work, we used a multimodal dialogue database and analyzed facial, head, and body motion during laughing speech. Based on the analysis of human behaviors during laughing speech, we propose a motion generation method that takes as input the speech signal and the laughing-speech intervals. Subjective experiments were conducted with our android robot by generating five different motion types covering several modalities. The evaluation results show the effectiveness of controlling different parts of the face, head, and upper body (eyelid narrowing, lip corner/cheek raising, eye blinking, head motion, and upper body motion).

DOI: 10.1017/ATSIP.2018.32
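
As a rough illustration of the kind of pipeline the abstract describes, the sketch below maps detected laughing-speech intervals to coarse motion commands for the parts evaluated in the paper (eyelids, lip corners/cheeks, blinks, head, and upper body). This is not the authors' implementation; the function name, actuator labels, and timing/amplitude values are hypothetical placeholders, shown only to make the input/output relationship concrete.

    # Illustrative sketch only: map laughing-speech intervals to coarse motion commands.
    # Actuator names and values are hypothetical, not taken from the paper.
    from dataclasses import dataclass

    @dataclass
    class MotionCommand:
        time: float      # seconds from utterance start
        actuator: str    # e.g. "eyelid", "lip_corner", "blink", "head_pitch", "torso_pitch"
        value: float     # normalized target position (0 = neutral, 1 = maximum)

    def generate_laughter_motions(laugh_intervals):
        """Map laughing-speech intervals (start_sec, end_sec) to coarse motion commands."""
        commands = []
        for start, end in laugh_intervals:
            # Facial expression: narrow the eyelids and raise the lip corners/cheeks
            # for the duration of the laughter, then relax back to neutral.
            commands += [
                MotionCommand(start, "eyelid", 0.6),
                MotionCommand(start, "lip_corner", 0.8),
                MotionCommand(end, "eyelid", 0.0),
                MotionCommand(end, "lip_corner", 0.0),
            ]
            # Head and upper body: a slight pitch movement during the laughter event.
            commands += [
                MotionCommand(start, "head_pitch", 0.3),
                MotionCommand(start, "torso_pitch", 0.2),
                MotionCommand(end, "head_pitch", 0.0),
                MotionCommand(end, "torso_pitch", 0.0),
            ]
            # An eye blink near the end of the laughter event.
            commands.append(MotionCommand(max(start, end - 0.2), "blink", 1.0))
        return sorted(commands, key=lambda c: c.time)

    # Example: two laughing-speech intervals in a single utterance.
    for cmd in generate_laughter_motions([(1.2, 2.5), (4.0, 4.8)]):
        print(cmd)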