Ron Petrick's Webpages

Invited talk

Knowledge-Level Planning for Task-Based Robot Action and Human-Robot Interaction, R. Petrick, slides from a talk presented at Sabancı University, İstanbul, Turkey, 2015-02-11.

[ slides ]


A robot coexisting with humans must not only perform physical tasks, but also interact with humans in a socially appropriate manner. In many settings, this involves the use of social signals like gaze, facial expression, and language, in addition to traditional robot actions like moving and grasping. In this talk, I describe an application of knowledge-level planning to task-based action and social interaction, motivated by a robot that must interact with multiple human agents in a simple bartending domain. States are inferred from low-level sensors, using vision and speech as input modalities. High-level actions are selected by the PKS (Planning with Knowledge and Sensing) planner, which constructs plans with task, dialogue, and social actions, and provides an alternative to current mainstream methods of interaction management. In particular, PKS treats action selection as an instance of planning with incomplete information and sensing, and builds plans by reasoning about how the robot's knowledge changes due to action. The same planner also provides high-level control for low-level motion planning, using a facility for integrating externally-defined reasoning processes with PKS. An extension to the basic PKS system is also described for representing multi-agent knowledge. Examples are provided from a set of robot domains, illustrating the applicability of these techniques to a broad range of robot planning applications involving incomplete knowledge, real-world geometry, and multiple robots.
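To make the idea concrete, here is a minimal sketch of knowledge-level planning in the spirit described above. It is not the PKS implementation: the domain, action names, and the K/Kv knowledge notation are illustrative assumptions loosely modeled on the bartending scenario. A state is the set of facts the robot currently knows, sensing/dialogue actions (like asking for an order) add "know-value" facts rather than changing the world, and the planner searches over how knowledge grows with each action.

```python
from collections import deque

# Hypothetical mini-domain inspired by the bartending example; a sketch
# of knowledge-level planning, not the actual PKS system.
# K(p)  = the robot knows fact p holds.
# Kv(t) = the robot knows the value of term t (gained by sensing/asking).
ACTIONS = {
    # name: (knowledge preconditions, knowledge effects)
    "greet":      (set(),                      {"K(greeted)"}),
    "ask_order":  ({"K(greeted)"},             {"Kv(order)"}),  # dialogue/sensing
    "pour_drink": ({"Kv(order)"},              {"K(poured)"}),
    "serve":      ({"K(poured)", "Kv(order)"}, {"K(served)"}),
}

def plan(initial, goal):
    """Breadth-first search in knowledge space: an action is applicable
    when its preconditions are already known, and applying it extends
    what the robot knows."""
    start = frozenset(initial)
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, steps = frontier.popleft()
        if goal <= state:
            return steps
        for name, (pre, add) in ACTIONS.items():
            if pre <= state:
                nxt = frozenset(state | add)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None  # goal knowledge unreachable

print(plan(set(), {"K(served)"}))
# → ['greet', 'ask_order', 'pour_drink', 'serve']
```

Note how the plan interleaves a social action (greet), a dialogue/sensing action (ask_order), and physical task actions (pour_drink, serve) in a single search, which is the key point of treating interaction management as planning with incomplete information.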