Invited talk

Knowledge-Level Planning for Task-Based Human-Robot Interaction, R. Petrick, slides from a talk presented at the College of Computing and Informatics, Drexel University, Philadelphia, USA, 2014-06-20.

[ slides (coming soon) ]

Abstract

A robot coexisting with humans must not only be able to perform physical tasks, but must also interact with people in a socially appropriate manner. In many settings, this involves the use of social signals like gaze, facial expression, and language. In this talk, I describe an application of knowledge-level planning to the problem of task-based social interaction, using a robot that must interact with multiple human agents in a simple bartending domain. States are inferred from low-level sensors, using vision and speech as input modalities. High-level actions are selected by the PKS (Planning with Knowledge and Sensing) planner, which constructs plans containing task, dialogue, and social actions, and provides an alternative to current mainstream methods of interaction management. In particular, PKS treats action selection as an instance of planning with incomplete information and sensing, and builds plans by reasoning about how the robot's knowledge changes as a result of its actions. Examples are drawn from a series of drink-ordering scenarios which have been tested on a real robot with human participants. This work is part of a larger European Union-funded project called JAMES: Joint Action for Multimodal Embodied Social Systems.
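
To make the knowledge-level idea concrete, below is a minimal, hypothetical Python sketch of planning over knowledge states in a drink-ordering domain. It is not the PKS planner itself (PKS reasons over structured databases of known facts, know-whether, and known values), and the predicates and actions (customer_present, greet, ask_order, and so on) are invented for illustration. What the sketch mirrors is the central idea: actions, including dialogue acts like asking for an order, are modelled by how they change what the robot knows.

    # Illustrative sketch only: a toy knowledge-level planner for a
    # drink-ordering scenario. The domain vocabulary is invented; it is
    # not the PKS system or its action language.

    from collections import deque

    # A knowledge state is a frozenset of facts the robot knows to hold.
    INITIAL = frozenset({"customer_present"})
    GOAL = {"know_order", "drink_served"}

    # Each action: (name, knowledge preconditions, facts made known).
    ACTIONS = [
        # Social action: establish engagement before addressing the customer.
        ("greet",      {"customer_present"},       {"engaged"}),
        # Dialogue/sensing action: afterwards, the robot knows the order.
        ("ask_order",  {"engaged"},                {"know_order"}),
        # Task actions: physical steps that depend on what is known.
        ("pour_drink", {"know_order"},             {"drink_ready"}),
        ("hand_over",  {"drink_ready", "engaged"}, {"drink_served"}),
    ]

    def plan(initial, goal, actions):
        """Breadth-first search in knowledge space: applying an action
        adds the facts it makes known to the robot's knowledge state."""
        frontier = deque([(initial, [])])
        seen = {initial}
        while frontier:
            state, steps = frontier.popleft()
            if goal <= state:
                return steps
            for name, pre, add in actions:
                if pre <= state:
                    nxt = state | add
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append((nxt, steps + [name]))
        return None  # no plan reaches the goal

    if __name__ == "__main__":
        print(plan(INITIAL, GOAL, ACTIONS))
        # -> ['greet', 'ask_order', 'pour_drink', 'hand_over']

Running the sketch yields a four-step plan. The point of interest is that ask_order is selected because the goal requires the robot to *know* the order, not because it changes the physical world: the same reasoning pattern, scaled up, is what lets a knowledge-level planner interleave task, dialogue, and social actions in a single plan.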