Learning Translations: Emergent Communication Pretraining for Cooperative Language Acquisition
Dylan Cope, Peter McBurney
Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence
Main Track. Pages 40-48.
https://doi.org/10.24963/ijcai.2024/5
In Emergent Communication (EC), agents learn to communicate with one another, but the protocols that they develop are specialised to their training community. This observation led to research into Zero-Shot Coordination (ZSC) for learning communication strategies that are robust to agents not encountered during training. However, ZSC typically assumes that no prior data is available about the agents that will be encountered in the zero-shot setting. In many cases, this presents an unnecessarily hard problem and rules out communication via pre-established conventions. We propose a novel AI challenge called a Cooperative Language Acquisition Problem (CLAP) in which the ZSC assumptions are relaxed by allowing a 'joiner' agent to learn from a dataset of interactions between agents in a target community. We propose and compare two methods for solving CLAPs: Behaviour Cloning (BC), and Emergent Communication pretraining and Translation Learning (ECTL), in which an agent is trained in self-play with EC and then learns to translate between an emergent protocol and the target community's protocol.
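To make the translation-learning step of ECTL concrete, the sketch below is a minimal, hypothetical illustration (not the paper's implementation): after EC pretraining in self-play, a small supervised translator maps tokens of the emergent protocol to tokens of the target community's protocol, fit on message pairs derived from the interaction dataset. All names, vocabulary sizes, and the placeholder data are assumptions for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical vocabulary sizes; the paper does not specify these here.
EC_VOCAB = 16      # emergent protocol vocabulary (from self-play pretraining)
TARGET_VOCAB = 16  # target community's protocol vocabulary

class Translator(nn.Module):
    """Maps an emergent-protocol token to logits over the target
    community's vocabulary; a mirrored module could translate back."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Embedding(EC_VOCAB, 32),
            nn.Linear(32, 32),
            nn.ReLU(),
            nn.Linear(32, TARGET_VOCAB),
        )

    def forward(self, ec_tokens):
        return self.net(ec_tokens)

# Placeholder pairs standing in for (emergent message, target-community
# message observed in the same situation), derived from the CLAP dataset.
dataset = [(torch.tensor(i % EC_VOCAB), torch.tensor((i * 3) % TARGET_VOCAB))
           for i in range(256)]

translator = Translator()
opt = torch.optim.Adam(translator.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Standard supervised training loop over the paired messages.
for epoch in range(5):
    for ec_tok, tgt_tok in dataset:
        logits = translator(ec_tok.unsqueeze(0))
        loss = loss_fn(logits, tgt_tok.unsqueeze(0))
        opt.zero_grad()
        loss.backward()
        opt.step()
```

In this framing, the emergent protocol supplies a grounded message space learned cheaply in self-play, and only the comparatively small translation model must be fit from the limited target-community data.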
Keywords:
Agent-based and Multi-agent Systems: MAS: Agent communication
Agent-based and Multi-agent Systems: MAS: Agent-based simulation and emergence
Agent-based and Multi-agent Systems: MAS: Coordination and cooperation
Machine Learning: ML: Multiagent Reinforcement Learning