GENERATION OF HUMANOID ROBOT'S FACIAL EXPRESSIONS FOR CONTEXT-AWARE COMMUNICATION

    https://doi.org/10.1142/S0219843613500138

    Communication between humans and robots is an important aspect of humanoid robotics. For natural interaction, robots capable of nonverbal communication must be developed. However, despite recent efforts, robots can still show only limited expressive capabilities. The purpose of this work is to create a facial expression generator that can be applied to the 24-DoF head of the humanoid robot KOBIAN-R. In this manuscript, we present a system that, based on relevant studies of human communication and facial anatomy, can produce thousands of combinations of facial and neck movements. The wide range of expressions covers not only primary emotions, but also complex or blended ones, as well as communication acts that are not strictly categorized as emotions. Results showed that the recognition rate of expressions produced by this system is comparable to that of the most common facial expressions. Context-based recognition, which is especially important in the case of more complex communication acts, was also evaluated. The results also showed that the produced robotic expressions can alter the meaning of a sentence in the same way human expressions do. We conclude that our system can successfully improve the communication abilities of KOBIAN-R, making it capable of complex interaction in the future.
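    As a rough illustration of the kind of mapping such a generator performs, the sketch below blends action-unit-like primitives into clipped joint commands for a multi-DoF robot head. It is a minimal sketch only: the unit names, DoF names, gains, and limits are hypothetical placeholders, not values taken from the KOBIAN-R system described in the paper.

    # Minimal sketch (hypothetical, not the paper's implementation): blending
    # action-unit-like primitives into joint angles for a multi-DoF robot head.
    # All unit names, DoF names, gains, and limits below are illustrative only.

    # Action-unit intensities (0..1) for two example emotion primitives.
    PRIMITIVES = {
        "joy":      {"lip_corner_puller": 1.0, "cheek_raiser": 0.8, "jaw_drop": 0.3},
        "surprise": {"brow_raiser": 1.0, "upper_lid_raiser": 0.9, "jaw_drop": 0.7},
    }

    # Linear map from each action unit to head DoF angles (degrees at intensity 1).
    AU_TO_DOF = {
        "lip_corner_puller": {"mouth_corner_l": 25.0, "mouth_corner_r": 25.0},
        "cheek_raiser":      {"lower_eyelid_l": 10.0, "lower_eyelid_r": 10.0},
        "brow_raiser":       {"eyebrow_l": 20.0, "eyebrow_r": 20.0},
        "upper_lid_raiser":  {"upper_eyelid_l": 15.0, "upper_eyelid_r": 15.0},
        "jaw_drop":          {"jaw": 30.0},
    }

    # Mechanical limits for every DoF (degrees).
    DOF_LIMITS = {dof: (-35.0, 35.0) for unit in AU_TO_DOF.values() for dof in unit}

    def blend(weights):
        """Combine weighted emotion primitives into one action-unit intensity vector."""
        aus = {}
        for emotion, weight in weights.items():
            for au, intensity in PRIMITIVES[emotion].items():
                aus[au] = min(1.0, aus.get(au, 0.0) + weight * intensity)
        return aus

    def to_dof_angles(aus):
        """Map blended action units to joint angles, clipped to each DoF's limits."""
        angles = {}
        for au, intensity in aus.items():
            for dof, gain in AU_TO_DOF[au].items():
                angles[dof] = angles.get(dof, 0.0) + intensity * gain
        for dof, angle in angles.items():
            lo, hi = DOF_LIMITS[dof]
            angles[dof] = max(lo, min(hi, angle))
        return angles

    if __name__ == "__main__":
        # Example: a blended "pleasantly surprised" expression.
        print(to_dof_angles(blend({"joy": 0.6, "surprise": 0.7})))

    Blending weighted primitives before mapping to joints is one simple way to obtain the complex or mixed expressions mentioned above; the actual system may compose expressions quite differently.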
