Ron Petrick's Webpages

IPAB Workshop talk

Knowledge-Level Planning for Task-Based Human-Robot Interaction, R. Petrick, slides from a talk presented at the IPAB Workshop, School of Informatics, University of Edinburgh, UK, 2015-01-29.

[ slides ]


A robot coexisting with humans must not only be able to perform physical tasks, but must also interact with humans in a socially appropriate manner. In many settings, this involves the use of social signals like gaze, facial expression, and language. In this talk, I describe an application of knowledge-level planning to the problem of task-based social interaction, using a robot that must interact with multiple human agents in a simple bartending domain. States are inferred from low-level sensors, using vision and speech as input modalities. High-level actions are selected by the PKS (Planning with Knowledge and Sensing) planner, which constructs plans with task, dialogue, and social actions, and provides an alternative to current mainstream methods of interaction management. In particular, PKS treats the action selection problem as an instance of planning with incomplete information and sensing, and builds plans by reasoning about how the robot's knowledge changes due to action. Examples are provided from a series of drink-ordering scenarios involving human participants. This work was performed as part of the EU-funded project JAMES: Joint Action for Multimodal Embodied Social Systems.
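The core idea of knowledge-level planning — searching over what the robot *knows* rather than over the world state itself, with sensing and dialogue actions that add to that knowledge — can be illustrated with a minimal Python sketch. This is not PKS itself: the set-based knowledge representation, the action names (`greet`, `ask_drink`, `serve`), and the breadth-first search are all simplifying assumptions chosen for the bartending example.

```python
from collections import deque

# Hypothetical action model for a toy bartending domain (illustrative,
# not the PKS/JAMES domain definition): each action has knowledge
# preconditions (facts the robot must already know) and knowledge
# effects (facts it comes to know by acting, sensing, or asking).
ACTIONS = {
    "greet(customer)":     {"pre": set(),           "adds": {"attention"}},
    "ask_drink(customer)": {"pre": {"attention"},   "adds": {"drink_order"}},
    "serve(customer)":     {"pre": {"drink_order"}, "adds": {"served"}},
}

def plan(initial_knowledge, goal):
    """Breadth-first search over knowledge states: return a sequence of
    actions after which all goal facts are known, or None if no such
    sequence exists."""
    start = frozenset(initial_knowledge)
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        knowledge, steps = frontier.popleft()
        if goal <= knowledge:          # goal facts are all known
            return steps
        for name, act in ACTIONS.items():
            if act["pre"] <= knowledge:  # preconditions known to hold
                nxt = frozenset(knowledge | act["adds"])
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None

# Starting with no knowledge, the robot must interleave a social action
# (greeting) and a dialogue/sensing action (asking) before the physical
# task action (serving) becomes applicable.
print(plan(set(), {"served"}))
```

The dialogue action `ask_drink` is only useful for its knowledge effect — it changes nothing in the physical world — yet the planner selects it because `serve` cannot be applied until the order is known, which mirrors how PKS mixes task, dialogue, and social actions in a single plan.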