Multi-Agent Intention Recognition and Progression

Michael Dann, Yuan Yao, Natasha Alechina, Brian Logan, Felipe Meneguzzi, John Thangarajah

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

Abstract

For an agent in a multi-agent environment, it is often beneficial to be able to predict what other agents will do next when deciding how to act. Previous work in multi-agent intention scheduling assumes a priori knowledge of the current goals of other agents. In this paper, we present a new approach to multi-agent intention scheduling in which an agent uses online goal recognition to identify the goals currently being pursued by other agents while acting in pursuit of its own goals. We show how online goal recognition can be incorporated into an MCTS-based intention scheduler, and evaluate our approach in a range of scenarios. The results demonstrate that our approach can rapidly recognise the goals of other agents even when they are pursuing multiple goals concurrently, and that it performs comparably to agents that know the goals of other agents a priori.
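
The abstract describes incorporating online goal recognition into an MCTS-based intention scheduler. As a rough illustration of that combination (not the paper's implementation), the sketch below assumes a simple Bayesian posterior over another agent's candidate goals, updated after each observed action; each planning simulation then samples a goal hypothesis from the posterior, so the chosen action is scored against the remaining uncertainty. All names here (GoalRecogniser, plan_with_recognition, and the user-supplied simulate rollout) are hypothetical.

```python
import random
from collections import defaultdict

class GoalRecogniser:
    """Posterior over another agent's candidate goals (hypothetical helper;
    the paper's actual recognition method is not reproduced here)."""

    def __init__(self, candidate_goals):
        # Start from a uniform prior over the candidate goals.
        self.posterior = {g: 1.0 / len(candidate_goals) for g in candidate_goals}

    def observe(self, action, likelihood):
        # Bayesian update: likelihood(goal, action) should return P(action | goal).
        for g in self.posterior:
            self.posterior[g] *= likelihood(g, action)
        total = sum(self.posterior.values())
        if total > 0:
            for g in self.posterior:
                self.posterior[g] /= total

    def sample_goal(self):
        # Sample a goal hypothesis in proportion to the current posterior.
        goals, weights = zip(*self.posterior.items())
        return random.choices(goals, weights=weights)[0]

def plan_with_recognition(root_state, recogniser, simulate, n_sims=500):
    # One rollout per sampled goal hypothesis: the returned action is the one
    # with the best average return across the sampled hypotheses, so planning
    # hedges over uncertainty about what the other agent is pursuing.
    returns = defaultdict(list)
    for _ in range(n_sims):
        other_goal = recogniser.sample_goal()
        action, value = simulate(root_state, other_goal)  # user-supplied rollout
        returns[action].append(value)
    return max(returns, key=lambda a: sum(returns[a]) / len(returns[a]))
```

In the paper's setting, recognition runs online while the agent progresses its own intentions; in this sketch the two steps are simply interleaved, with observe called whenever another agent acts and plan_with_recognition called when the agent must choose its next step.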

Original language: English
Title of host publication: Proceedings of the 32nd International Joint Conference on Artificial Intelligence, IJCAI 2023
Editors: Edith Elkind
Publisher: International Joint Conferences on Artificial Intelligence
Pages: 91-99
Number of pages: 9
ISBN (Electronic): 9781956792034
Publication status: Published - 2023
Event: 32nd International Joint Conference on Artificial Intelligence, IJCAI 2023 - Macao, China
Duration: 19 Aug 2023 - 25 Aug 2023

Publication series

Name: IJCAI International Joint Conference on Artificial Intelligence
Volume: 2023-August
ISSN (Print): 1045-0823

Conference

Conference: 32nd International Joint Conference on Artificial Intelligence, IJCAI 2023
Country/Territory: China
City: Macao
Period: 19/08/23 - 25/08/23

ASJC Scopus subject areas

  • Artificial Intelligence
