Title
Compositional languages emerge in a neural iterated learning model
Abstract
The principle of compositionality, which enables natural language to represent complex concepts via a structured combination of simpler ones, allows us to convey an open-ended set of messages using a limited vocabulary. If compositionality is indeed a natural property of language, we may expect it to appear in communication protocols that are created by neural agents via grounded language learning. Inspired by the iterated learning framework, which simulates the process of language evolution, we propose an effective neural iterated learning algorithm that, when applied to interacting neural agents, facilitates the emergence of a more structured type of language. Indeed, these languages provide specific advantages to neural agents during training, which translates into a larger posterior probability that is then incrementally amplified via the iterated learning procedure. Our experiments confirm our analysis and also demonstrate that the emerged languages substantially improve the generalization of neural agent communication.
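The abstract's core claim is that easier-to-learn (more compositional) languages obtain a larger posterior probability during learning, and that iterated transmission across generations amplifies this advantage. The sketch below is a minimal toy illustration of that amplification mechanism only, not the paper's neural iterated learning algorithm: the two language labels, the uniform prior, and the learnability numbers are all hypothetical choices made for the example.

```python
# Toy transmission chain (hypothetical setup, not the paper's implementation):
# each generation's learner either recovers its teacher's language from a
# limited data sample, or falls back on its prior. A more learnable language
# is recovered more reliably, so the chain spends most generations using it.
import random

PRIOR = {"compositional": 0.5, "holistic": 0.5}
# Chance that the bottlenecked sample fully identifies the language (made-up numbers).
LEARNABILITY = {"compositional": 0.9, "holistic": 0.5}

def next_generation(language):
    """One transmission step: recover the teacher's language, or drift via the prior."""
    if random.random() < LEARNABILITY[language]:
        return language                                        # data was informative
    return random.choices(list(PRIOR), weights=list(PRIOR.values()))[0]  # drift

def iterated_learning(generations=1000):
    language = "holistic"              # start from an unstructured language
    history = []
    for _ in range(generations):
        language = next_generation(language)
        history.append(language)
    return history

history = iterated_learning()
print("fraction of generations using the compositional language:",
      history.count("compositional") / len(history))   # tends to be well above 0.5
```

Because the compositional language is reconstructed more reliably, it acts as a near-absorbing state of the chain, so even a small per-generation learning advantage accumulates into a strong population-level preference.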
Year
2020
Venue
ICLR
Keywords
Compositionality, Multi-agent, Emergent language, Iterated learning
DocType
Conference
ISSN
ICLR-2020
Citations
0
PageRank
0.34
References
16
Authors
5
Name | Order | Citations | PageRank
Yi Ren | 1 | 40 | 4.05
Shangmin Guo | 2 | 9 | 1.47
Matthieu Labeau | 3 | 25 | 7.40
Shay B. Cohen | 4 | 298 | 29.56
Simon Kirby | 5 | 0 | 0.68