Title: Robust learning of low-dimensional dynamics from large neural ensembles.
Abstract: Recordings from large populations of neurons make it possible to search for hypothesized low-dimensional dynamics. Finding these dynamics requires models that take into account biophysical constraints and can be fit efficiently and robustly. Here, we present an approach to dimensionality reduction for neural data that is convex, does not make strong assumptions about dynamics, does not require averaging over many trials, and is extensible to more complex statistical models that combine local and global influences. The results can be combined with spectral methods to learn dynamical systems models. The basic method can be seen as an extension of PCA to the exponential family using nuclear norm minimization. We evaluate the effectiveness of this method using an exact decomposition of the Bregman divergence that is analogous to variance explained for PCA. We show on model data that the parameters of latent linear dynamical systems can be recovered, and that even if the dynamics are not stationary we can still recover the true latent subspace. We also demonstrate an extension of nuclear norm minimization that can separate sparse local connections from global latent dynamics. Finally, we demonstrate improved prediction on real neural data from monkey motor cortex compared to fitting linear dynamical models without nuclear norm smoothing.
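The abstract describes exponential-family PCA via nuclear norm minimization. Below is a minimal illustrative sketch of that kind of estimator, not the paper's implementation: a Poisson observation model for spike counts whose matrix of log-rates is penalized by the nuclear norm, solved by proximal gradient descent with singular-value soft-thresholding. The solver, step size, penalty weight, and toy data are all assumptions made for the example.

```python
import numpy as np

def svt(M, tau):
    """Singular-value soft-thresholding: the prox operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def poisson_nucnorm_pca(Y, lam=2.0, step=0.01, iters=500):
    """Illustrative proximal-gradient solver (not the paper's algorithm) for
    min_Theta  sum(exp(Theta) - Y * Theta) + lam * ||Theta||_*,
    where Theta holds log firing rates; a small nuclear norm encourages
    low-rank Theta, i.e. low-dimensional latent structure."""
    Theta = np.log(Y.mean() + 1e-3) * np.ones_like(Y, dtype=float)
    for _ in range(iters):
        grad = np.exp(Theta) - Y          # gradient of the Poisson negative log-likelihood
        Theta = svt(Theta - step * grad, step * lam)
    return Theta

# Toy data: 20 neurons, 100 time bins, log-rates with 2-dimensional latent structure.
rng = np.random.default_rng(0)
log_rates = 0.5 * rng.normal(size=(20, 2)) @ rng.normal(size=(2, 100))
Y = rng.poisson(np.exp(log_rates)).astype(float)
Theta = poisson_nucnorm_pca(Y)
```

With a Gaussian likelihood this objective reduces to nuclear-norm-regularized PCA; the Poisson version keeps the count nature of spike data while the convex penalty replaces an explicit rank constraint.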
Year: 2013
Venue: NIPS
Field: Linear dynamical system, Dimensionality reduction, Subspace topology, Computer science, Matrix norm, Smoothing, Dynamical systems theory, Artificial intelligence, Statistical model, Bregman divergence, Machine learning
DocType: Conference
Citations: 9
PageRank: 0.67
References: 15
Authors: 3
Name                       | Order | Citations | PageRank
Pfau, David                | 1     | 80        | 6.76
Eftychios A. Pnevmatikakis | 2     | 80        | 8.86
Liam Paninski              | 3     | 9269      | 9.30