Title
Probabilistic consensus in Markovian multi-agent networks
Abstract
This paper addresses the probabilistic consensus problem in a network of Markovian agents. The dynamics of each agent is modeled as a finite-state Markov chain whose transition rates are affected by communication with its neighbors, thus inducing an emulation effect. Consensus is reached when all the agents' probability vectors converge to a common steady-state probability vector. The main result of the paper is a proof of consensus for communication networks described by either a complete graph or a star-topology graph. These results are also relevant from a network control perspective, as some parameters of the network model could serve as tuning knobs to steer the steady-state consensus to a desired target.
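The abstract's setup (finite-state Markovian agents whose distributions are coupled through their neighbors on a complete graph) can be illustrated with a minimal numerical sketch. This is not the paper's exact model: the base generator `Q`, the coupling gain `k`, and the simple "pull toward the neighbor average" emulation term are all assumptions chosen for illustration, showing how coupled probability vectors can converge to a common steady state.

```python
import numpy as np

# Illustrative sketch only, not the model proved in the paper.
# Each agent's probability vector p_i follows Kolmogorov forward dynamics
# under a common base generator Q, plus a hypothetical "emulation" term
# pulling it toward the average of its neighbors' vectors
# (complete graph: every other agent is a neighbor).

def simulate(num_agents=5, k=1.0, dt=0.01, steps=5000, seed=0):
    rng = np.random.default_rng(seed)
    # Base generator of a 3-state continuous-time Markov chain (rows sum to 0).
    Q = np.array([[-0.6,  0.4,  0.2],
                  [ 0.3, -0.5,  0.2],
                  [ 0.1,  0.4, -0.5]])
    # Random initial probability vector for each agent, normalized to sum to 1.
    P = rng.random((num_agents, 3))
    P /= P.sum(axis=1, keepdims=True)
    for _ in range(steps):
        mean_p = P.mean(axis=0)           # neighbor average on a complete graph
        dP = P @ Q + k * (mean_p - P)     # chain dynamics + emulation pull
        P = P + dt * dP                   # forward-Euler step
    return P

P = simulate()
# Disagreement among agents after the run (should be numerically negligible).
print(np.max(np.abs(P - P.mean(axis=0))))
```

Both terms of `dP` have zero row sums, so each row of `P` remains a probability vector throughout the integration; the coupling term contracts the disagreement between agents, so all rows approach the same steady-state vector.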
Year
2014
DOI
10.1109/ECC.2014.6862254
Venue
ECC
Keywords
markov processes, graph theory, multi-agent systems, networked control systems, probability, markovian multiagent networks, agent probability vectors, common steady-state probability vector, communication networks, complete graph, emulation effect, finite-state markov chain, network control perspective, network model, probabilistic consensus problem, star-topology graph, transition rates, probabilistic logic, probability distribution, vectors, steady state, mathematical model
Field
Consensus, Additive Markov chain, Continuous-time Markov chain, Theoretical computer science, Hammersley–Clifford theorem, Bayesian network, Artificial intelligence, Probability vector, Probabilistic logic, Graphical model, Machine learning, Mathematics
DocType
Conference
Citations
1
PageRank
0.37
References
1
Authors
4
Name | Order | Citations | PageRank
P. Bolzern | 1 | 99 | 17.69
Cerotti, D. | 2 | 1 | 0.37
Patrizio Colaneri | 3 | 44 | 9.19
Marco Gribaudo | 4 | 25 | 2.36