Seminar talk

JAMES: Joint Action for Multimodal Embodied Social Systems, R. Petrick, slides from a talk presented to the Dialogue and Interaction Working Group (DIG), School of Informatics, University of Edinburgh, UK, 2011-02-14.

[ slides ]

Abstract

In recent years, robot developers have begun to consider the social aspects of robot behaviour: a robot coexisting with humans must not only be able to successfully carry out physical tasks in the world, but must also be able to interact with humans in a socially appropriate manner. Achieving this behaviour requires endowing a robot with the ability to recognise, understand, and generate multimodal social signals (e.g., gesture, facial expression, and language) in order to interpret and respond to humans in a realistic manner.

JAMES (Joint Action for Multimodal Embodied Social Systems) is a new EU FP7 project (coordinated by the University of Edinburgh) exploring the problem of social interaction in multi-agent environments. JAMES aims to develop a socially intelligent robot that combines task-based behaviour with the ability to understand and respond to a wide range of embodied, multimodal, communicative signals in a socially appropriate manner. To do this, the project will combine studies of human social communicative behaviour with the development of new technical components for a humanoid robot, with the goal of demonstrating the resulting system in a bartending scenario involving realistic, open-ended, multi-party interactions.

In this talk, I will present an overview of JAMES and highlight its main research themes, with particular emphasis on the role of automated planning and reasoning in the project.