Title
A new look at state-space models for neural data.
Abstract
State space methods have proven indispensable in neural data analysis. However, common methods for performing inference in state-space models with non-Gaussian observations rely on certain approximations which are not always accurate. Here we review direct optimization methods that avoid these approximations, but that nonetheless retain the computational efficiency of the approximate methods. We discuss a variety of examples, applying these direct optimization techniques to problems in spike train smoothing, stimulus decoding, parameter estimation, and inference of synaptic properties. Along the way, we point out connections to some related standard statistical methods, including spline smoothing and isotonic regression. Finally, we note that the computational methods reviewed here do not in fact depend on the state-space setting at all; instead, the key property we are exploiting involves the bandedness of certain matrices. We close by discussing some applications of this more general point of view, including Markov chain Monte Carlo methods for neural decoding and efficient estimation of spatially-varying firing rates.
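Illustrative sketch (not code from the paper): the abstract's central computational point is that for state-space models the Hessian of the log-posterior over the latent path is banded (tridiagonal in the scalar case), so each Newton step of a direct MAP optimization costs only O(T). The Python/SciPy snippet below sketches that idea for an assumed AR(1) latent process with Poisson spike-count observations; the model, parameter values, and names (map_path, bandmat_vec, log_post) are illustrative assumptions, not notation taken from the paper.

import numpy as np
from scipy.linalg import solveh_banded

def bandmat_vec(Q, x):
    # Multiply a symmetric tridiagonal matrix, stored in upper banded form
    # (row 0 = superdiagonal, row 1 = main diagonal), by a vector x.
    y = Q[1, :] * x
    y[:-1] += Q[0, 1:] * x[1:]
    y[1:] += Q[0, 1:] * x[:-1]
    return y

def log_post(x, y, dt, Q):
    # Log-posterior up to a constant: Poisson log-likelihood plus Gaussian AR(1) log-prior.
    return np.sum(y * x - np.exp(x) * dt) - 0.5 * x @ bandmat_vec(Q, x)

def map_path(y, dt=0.01, a=0.98, q=0.05, n_iter=50, tol=1e-8):
    # MAP estimate of the latent path x_{1:T} under the assumed model
    #   x_t = a * x_{t-1} + N(0, q),   y_t ~ Poisson(exp(x_t) * dt),
    # computed by Newton's method. The Hessian of the log-posterior is
    # tridiagonal, so each Newton step is a banded solve costing O(T).
    T = len(y)
    Q = np.zeros((2, T))               # prior precision of the AR(1) path
    Q[1, :] = (1.0 + a ** 2) / q       # main diagonal
    Q[1, -1] = 1.0 / q
    Q[0, 1:] = -a / q                  # superdiagonal
    x = np.zeros(T)                    # initial guess for the latent path
    for _ in range(n_iter):
        rate = np.exp(x) * dt
        grad = (y - rate) - bandmat_vec(Q, x)   # gradient of log-posterior
        H = Q.copy()
        H[1, :] += rate                # negative Hessian, still tridiagonal
        dx = solveh_banded(H, grad)    # banded Cholesky solve: O(T), not O(T^3)
        step = 1.0                     # simple backtracking to avoid overshooting
        while step > 1e-4 and log_post(x + step * dx, y, dt, Q) < log_post(x, y, dt, Q):
            step *= 0.5
        x = x + step * dx
        if np.max(np.abs(step * dx)) < tol:
            break
    return x

# Example usage: smooth a simulated spike-count sequence.
rng = np.random.default_rng(0)
x_true = np.cumsum(rng.normal(0.0, 0.1, 500))
spikes = rng.poisson(np.exp(x_true) * 0.01)
x_map = map_path(spikes)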
Year
2010
DOI
10.1007/s10827-009-0179-x
Venue
Journal of Computational Neuroscience
Keywords
Neural coding, State-space models, Hidden Markov model, Tridiagonal matrix
Field
Mathematical optimization, Markov chain Monte Carlo, Spike train, Neural coding, Inference, Smoothing, Neural decoding, Artificial intelligence, Hidden Markov model, State space, Machine learning, Mathematics
DocType
Journal
Volume
29
Issue
1-2
ISSN
1573-6873
Citations
40
PageRank
2.65
References
34
Authors
8
Name                    Order   Citations   PageRank
Liam Paninski           1       926         99.30
Yashar Ahmadian         2       116         8.29
Daniel Gil Ferreira     3       52          3.62
Shinsuke Koyama         4       94          8.84
Kamiar Rahnama Rad      5       92          7.38
Michael Vidne           6       81          5.31
Joshua T. Vogelstein    7       273         31.99
Wu, Wei                 8       77          7.13