Ron Petrick's Webpages

Workshop talk

Knowledge-Level Planning for Task-Oriented Social Interaction, R. Petrick, slides from a talk presented at the Workshop on Formalised Social Intelligence, Technical University of Denmark, Copenhagen, Denmark, 2014-12-18.

[ slides (coming soon) ]


An intelligent agent coexisting with humans must not only be able to perform physical tasks, but must also be able to interact with humans in a socially appropriate manner. In many settings, this involves the use of social signals like gaze, facial expression, and language. To take full advantage of such information, the results of social signals should be represented as part of the agent's domain model, and utilised by its reasoning and decision-making processes, for instance when selecting actions to perform in the world. Using the example of a robot in a simple bartending scenario, I will describe an application of knowledge-level planning to the problem of task-based social interaction, and discuss how knowledge, action, and social information are modelled within the PKS (Planning with Knowledge and Sensing) planner as an instance of planning with incomplete information and sensing. I will also discuss current work that seeks to extend this approach to reason about multiagent knowledge, and to encode certain types of social protocols.
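To make the idea of knowledge-level planning with sensing more concrete, here is a minimal toy sketch in Python. It is loosely inspired by the distinction PKS draws between facts an agent knows to be true and propositions the agent merely knows the truth value of (i.e. "knows whether") after sensing. All class and function names, and the bartending-scenario propositions, are illustrative assumptions for this sketch, not the actual PKS representation or API.

```python
# Toy sketch of knowledge-level state with sensing, loosely inspired by
# PKS-style "known facts" vs "known-whether" databases.
# All names here are hypothetical, chosen for illustration only.

class KnowledgeState:
    def __init__(self):
        self.kf = set()   # facts the agent knows to be true
        self.kw = set()   # propositions whose truth value will be known after sensing

    def knows(self, fact):
        """The agent knows this fact holds."""
        return fact in self.kf

    def knows_whether(self, prop):
        """The agent knows the truth value of prop (true or false)."""
        return prop in self.kw or prop in self.kf

def sense_customer_attention(state):
    # Sensing action: after observing gaze, the robot knows *whether* the
    # customer seeks attention, without fixing which outcome holds at plan time.
    state.kw.add("customer_seeks_attention")

def greet_customer(state):
    # Social action: only applicable once attention is actually established.
    if state.knows("customer_seeks_attention"):
        state.kf.add("customer_greeted")

s = KnowledgeState()
sense_customer_attention(s)
print(s.knows_whether("customer_seeks_attention"))  # True: a plan can branch on the sensed value
print(s.knows("customer_seeks_attention"))          # False: truth value still unknown at plan time
```

In this sketch, a planner could build a conditional plan that branches on the sensed value of `customer_seeks_attention`, greeting the customer in the branch where it holds; this mirrors, in a very simplified form, how sensing actions enable contingent plans under incomplete information.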