Title
Learning a common language through an emergent interaction topology
Abstract
We study the effects of various emergent topologies of interaction on the rate of language convergence in a population of communicating agents. The agents generate, parse, and learn sentences from each other using recurrent neural networks. An agent chooses another agent to learn from, based on that agent's fitness. Fitness is defined to include a frequency-dependent term capturing the approximate number of interactions an agent has had with others---its "popularity" as a teacher. This method of frequency-dependent selection is based on our earlier Noisy Preferential Attachment algorithm, which has been shown to produce various network topologies, including scale-free and small-world networks. We show that convergence occurs much more quickly with this strategy than it does for uniformly random interactions. In addition, this strategy more closely represents choice preference dynamics in large natural populations, and so may be more realistic as a model for adaptive language.
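The abstract describes teacher selection by frequency-dependent fitness in the spirit of the authors' Noisy Preferential Attachment. Below is a minimal sketch of one plausible reading: with some probability the learner picks a teacher uniformly at random (the "noise"), otherwise in proportion to each agent's interaction count (its "popularity"). The function name, the epsilon parameter, and the +1 smoothing are illustrative assumptions, not the authors' published definition.

import random

def choose_teacher(interaction_counts, learner, epsilon=0.1):
    # Frequency-dependent choice: with probability epsilon pick
    # uniformly at random (the noise), otherwise pick in proportion
    # to each agent's interaction count (its popularity as a teacher).
    candidates = [a for a in interaction_counts if a != learner]
    if random.random() < epsilon:
        return random.choice(candidates)
    # +1 keeps never-chosen agents selectable
    weights = [interaction_counts[a] + 1 for a in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]

# Usage: 5 agents; repeated rounds concentrate interactions on a few agents.
counts = {i: 0 for i in range(5)}
for _ in range(1000):
    learner = random.randrange(5)
    teacher = choose_teacher(counts, learner)
    counts[teacher] += 1
print(counts)

Run repeatedly, the popularity-weighted rule concentrates interactions on a few agents, the rich-get-richer dynamic behind the scale-free and small-world topologies the abstract mentions.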
Year
2006
DOI
10.1145/1160633.1160891
Venue
AAMAS
Keywords
adaptive language, language convergence, common language, noisy preferential attachment, frequency-dependent selection, emergent interaction topology, network topology, scale-free networks, small-world networks, neural networks, choice preference dynamics
Field
Convergence, Population, Computer science, Popularity, Recurrent neural network, Network topology, Scale-free network, Artificial intelligence, Parsing, Preferential attachment, Machine learning
DocType
Conference
ISBN
1-59593-303-4
Citations
0
PageRank
0.34
References
5
Authors
3
Name, Order, Citations, PageRank
Samarth Swarup, 1, 213, 28.37
Kiran Lakkaraju, 2, 445, 36.90
Les Gasser, 3, 1601, 261.00