Title
Coding efficiency and detectability of rate fluctuations with non-Poisson neuronal firing.
Abstract
Statistical features of neuronal spike trains are known to be non-Poisson. Here, we investigate the extent to which this non-Poisson character affects the efficiency of transmitting information about fluctuating firing rates. For this purpose, we introduce the Kullback-Leibler (KL) divergence as a measure of the efficiency of information encoding, and assume that spike trains are generated by time-rescaled renewal processes. We show that the KL divergence determines the lower bound of the degree of rate fluctuations below which the temporal variation of the firing rates is undetectable from sparse data. We also show that the KL divergence, as well as the lower bound, depends not only on the variability of spikes in terms of the coefficient of variation, but also significantly on the higher-order moments of interspike interval (ISI) distributions. We examine three specific models that are commonly used for describing the stochastic nature of spikes (the gamma, inverse Gaussian (IG) and lognormal ISI distributions), and find that the time-rescaled renewal process with the IG distribution achieves the largest KL divergence, followed by the lognormal and gamma distributions.
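As a rough illustration (not taken from the paper itself), the following Python sketch simulates spike trains from a time-rescaled renewal process with unit-mean gamma, inverse Gaussian, and lognormal ISI distributions, the three models compared in the abstract. The sinusoidal rate function, the CV value of 0.7, and the helper names unit_mean_isi and time_rescaled_spikes are illustrative assumptions.

# Minimal sketch, assuming a sinusoidal rate profile and CV = 0.7:
# a unit-rate renewal process is mapped through the inverse of the
# cumulative rate Lambda(t) to produce a rate-fluctuating spike train.
import numpy as np

def unit_mean_isi(dist, cv, size, rng):
    """Draw ISIs with mean 1 and coefficient of variation cv."""
    if dist == "gamma":
        kappa = 1.0 / cv**2                      # shape; scale 1/kappa gives mean 1
        return rng.gamma(kappa, 1.0 / kappa, size)
    if dist == "inverse_gaussian":
        lam = 1.0 / cv**2                        # mean 1, variance 1/lam = cv^2
        return rng.wald(1.0, lam, size)
    if dist == "lognormal":
        sigma2 = np.log(1.0 + cv**2)             # mean 1, variance e^{sigma^2} - 1 = cv^2
        return rng.lognormal(-0.5 * sigma2, np.sqrt(sigma2), size)
    raise ValueError(dist)

def time_rescaled_spikes(rate_fn, t_max, dist, cv, rng, dt=1e-3):
    """Spike times on [0, t_max]: invert Lambda(t) = int_0^t rate_fn(s) ds
    applied to a unit-rate renewal process with the chosen ISI distribution."""
    t_grid = np.arange(0.0, t_max + dt, dt)
    Lambda = np.concatenate(([0.0], np.cumsum(rate_fn(t_grid[:-1]) * dt)))
    n_guess = int(2 * Lambda[-1] + 20)           # generous overdraw of unit-mean ISIs
    rescaled = np.cumsum(unit_mean_isi(dist, cv, n_guess, rng))
    rescaled = rescaled[rescaled < Lambda[-1]]
    return np.interp(rescaled, Lambda, t_grid)   # map rescaled times back to real time

rng = np.random.default_rng(0)
rate = lambda t: 10.0 * (1.0 + 0.5 * np.sin(2 * np.pi * t))   # assumed fluctuating rate (Hz)
for dist in ("gamma", "inverse_gaussian", "lognormal"):
    spikes = time_rescaled_spikes(rate, t_max=100.0, dist=dist, cv=0.7, rng=rng)
    isi = np.diff(spikes)
    print(dist, "n_spikes:", spikes.size, "empirical CV:", isi.std() / isi.mean())

Because time rescaling imposes the same ISI variability on top of an arbitrary rate profile, such simulated trains are a natural testbed for comparing how well rate fluctuations can be detected under the three ISI models.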
Year
2012
Venue
NIPS
Field
Divergence, Inverse Gaussian distribution, Renewal theory, Upper and lower bounds, Artificial intelligence, Gamma distribution, Poisson distribution, Log-normal distribution, Mathematics, Machine learning, Kullback–Leibler divergence
DocType
Conference
Citations
0
PageRank
0.34
References
9
Authors
1
Name
Shinsuke Koyama
Order
1
Citations, PageRank
948.84