Note: I will be on sabbatical from 1st September 2017 to 31st August 2018, and may only be reading and responding to email sporadically.

I am a Reader (Associate Professor) at the School of Informatics at The University of Edinburgh, where I develop AI algorithms and architectures that can better understand humans and support collaboration among them. Making AI safe and ensuring it behaves responsibly is an important part of this vision.

I lead the Agents Group at Edinburgh, and have been very lucky to work with many amazing PhD students, postdocs, and visitors in the group over the years on topics in this research area, such as fair task recommendation mechanisms, scalable multiagent planning algorithms, and automated norm synthesis, to give just a few examples.

Until recently, I was Director of the Centre for Intelligent Systems and their Applications, and I have been involved in several large research initiatives supported by more than £10 million of external funding. I co-ordinate the ESSENCE network, which focuses on using human communication techniques to enable AI systems to negotiate and evolve meaning. I led work on social orchestration systems for coordinating collective human activity in the SmartSociety project. In the UnBias project, I am developing fair data-driven algorithms that help address people's concerns about algorithmic bias.

I'm always interested in exceptional PhD students - please get in touch if you have a strong background in AI (especially multiagent systems, automated planning, knowledge representation, game theory, or symbolic learning). You should read the PhD study at CISA page, and you can find some example PhD topics here. I am a champion for increasing the percentage of female PhD students in our School, so if you are female and reading this, please consider yourself specifically encouraged to apply!

You might also be interested in my CV, my research, publications, teaching, and other activities. Follow me on Twitter for news, announcements, and random musings. Thanks for stopping by.