| Abstract |
| --- |
| This paper presents a learning system based on Artificial Life for animating virtual entities. The model uses an extension of a classifier system to build agent behavior dynamically, by emergence. A behavior is selected from a set of binary rules that evolves continuously to maximize predefined goals. Reinforcement rewards a rule and thereby evaluates its efficiency in a given context. We investigate the interaction between virtual agents and a human-controlled clone immersed in virtual soccer. In the simulation, each entity evolves in real time by cooperating and communicating with its teammates. We evaluate the benefits of communication within a team and show how rule sharing and human intervention can improve the learning of a group. |
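The abstract describes a classifier system in which a behavior is chosen from a population of binary rules whose strengths are updated by reinforcement. The paper's own implementation is not given here; the following is a minimal illustrative sketch of that general mechanism, with all names (`Rule`, `select_rule`, `reinforce`) and parameter values being assumptions, not the authors' code:

```python
import random

class Rule:
    """A binary classifier rule: a condition pattern over a bit-string
    context ('#' is a wildcard), an action, and a strength that
    reinforcement updates."""
    def __init__(self, condition, action, strength=1.0):
        self.condition = condition  # e.g. "1#0#" over a 4-bit context
        self.action = action
        self.strength = strength

    def matches(self, context):
        return all(c == '#' or c == b for c, b in zip(self.condition, context))

def select_rule(rules, context):
    """Pick among the rules matching the context, weighting the draw by
    strength (roulette-wheel selection)."""
    matching = [r for r in rules if r.matches(context)]
    if not matching:
        return None
    total = sum(r.strength for r in matching)
    pick = random.uniform(0, total)
    acc = 0.0
    for r in matching:
        acc += r.strength
        if acc >= pick:
            return r
    return matching[-1]

def reinforce(rule, reward, rate=0.2):
    """Move the selected rule's strength toward the received reward,
    so strength comes to estimate the rule's efficiency in context."""
    rule.strength += rate * (reward - rule.strength)
```

A typical step would encode the agent's perception as a bit string, call `select_rule`, execute the chosen action, then call `reinforce` with the resulting reward; rule sharing between teammates, as evaluated in the paper, would amount to copying high-strength rules into another agent's rule set.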
| Year | DOI | Venue |
| --- | --- | --- |
| 2000 | 10.1007/3-540-45016-5_14 | Virtual Worlds |
| Keywords | Field | DocType |
| --- | --- | --- |
| predefined goal, virtual agent, entity evolves, classifiers system, virtual entity, learning agents, human controlled clone, human intervention, binary rule, virtual soccer, artificial life, real time | Artificial life, Virtual reality, Virtual machine, Computer science, Image synthesis, Animation, Artificial intelligence, Computer animation, Maximization | Conference |
| Volume | ISSN | ISBN |
| --- | --- | --- |
| 1834 | 0302-9743 | 3-540-67707-0 |
| Citations | PageRank | References |
| --- | --- | --- |
| 1 | 0.37 | 16 |
| Authors |
| --- |
| 3 |
| Name | Order | Citations | PageRank |
| --- | --- | --- | --- |
| Cédric Sanza | 1 | 32 | 3.81 |
| Cyril Panatier | 2 | 1 | 0.37 |
| Yves Duthen | 3 | 165 | 26.63 |