Title
Efficient inference for time-varying behavior during learning.
Abstract
The process of learning new behaviors over time is a problem of great interest in both neuroscience and artificial intelligence. However, most standard analyses of animal training data either treat behavior as fixed or track only coarse performance statistics (e.g., accuracy, bias), providing limited insight into the evolution of the policies governing behavior. To overcome these limitations, we propose a dynamic psychophysical model that efficiently tracks trial-to-trial changes in behavior over the course of training. Our model is a dynamic logistic regression, parametrized by a set of time-varying weights that express dependence on sensory stimuli as well as task-irrelevant covariates, such as stimulus, choice, and answer history. Our implementation scales to large behavioral datasets, allowing us to infer 500K parameters (e.g., 10 weights over 50K trials) in minutes on a desktop computer. We optimize hyperparameters governing how rapidly each weight evolves over time using the decoupled Laplace approximation, an efficient method for maximizing marginal likelihood in non-conjugate models. To illustrate performance, we apply our method to psychophysical data from both rats and human subjects learning a delayed sensory discrimination task. The model successfully tracks the psychophysical weights of rats over the course of training, capturing day-to-day and trial-to-trial fluctuations that underlie changes in performance, choice bias, and dependencies on task history. Finally, we investigate why rats frequently make mistakes on easy trials, and suggest that apparent lapses can be explained by sub-optimal weighting of known task covariates.
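As a rough illustration of the model class described in the abstract (not the authors' implementation or their decoupled-Laplace procedure), the Python sketch below fits per-trial psychophysical weights to simulated choices. The choice on each trial is Bernoulli(sigmoid(x_t . w_t)), the weights follow a Gaussian random walk with per-weight variances sigma2, and the full weight trajectory is recovered by MAP estimation with L-BFGS while sigma2 is held fixed; in the paper, such hyperparameters are instead optimized by maximizing marginal likelihood. The function names (simulate, neg_log_posterior, map_weights) are illustrative assumptions.

# Minimal sketch of a dynamic logistic regression with random-walk weights.
# Assumptions: fixed random-walk variances sigma2, MAP fit via L-BFGS
# (the paper's hyperparameter optimization is not reproduced here).
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit


def simulate(T=2000, K=3, sigma=0.05, seed=0):
    """Simulate choices driven by slowly drifting psychophysical weights."""
    rng = np.random.default_rng(seed)
    W = np.cumsum(rng.normal(0.0, sigma, size=(T, K)), axis=0)  # random-walk weights
    X = rng.normal(size=(T, K))                                 # stimulus / history covariates
    y = (rng.random(T) < expit((X * W).sum(axis=1))).astype(float)  # binary choices
    return X, y, W


def neg_log_posterior(w_flat, X, y, sigma2):
    """Negative log-posterior and gradient for the full T-by-K weight trajectory."""
    T, K = X.shape
    W = w_flat.reshape(T, K)
    z = (X * W).sum(axis=1)                                     # per-trial logits
    loglik = np.sum(y * z - np.logaddexp(0.0, z))               # Bernoulli log-likelihood
    dW = np.diff(W, axis=0)                                     # w_t - w_{t-1}
    logprior = -0.5 * np.sum(dW ** 2 / sigma2)                  # Gaussian random-walk prior
    grad = (y - expit(z))[:, None] * X                          # likelihood gradient
    grad[:-1] += dW / sigma2                                    # prior term from (w_{t+1} - w_t)
    grad[1:] -= dW / sigma2                                     # prior term from (w_t - w_{t-1})
    return -(loglik + logprior), -grad.ravel()


def map_weights(X, y, sigma2):
    """MAP estimate of all T*K weights under fixed random-walk variances."""
    T, K = X.shape
    res = minimize(neg_log_posterior, np.zeros(T * K), args=(X, y, sigma2),
                   jac=True, method="L-BFGS-B")
    return res.x.reshape(T, K)


if __name__ == "__main__":
    X, y, W_true = simulate()
    W_hat = map_weights(X, y, sigma2=np.full(3, 0.05 ** 2))
    print("mean |error| of recovered weights:", np.abs(W_hat - W_true).mean())

Recovering a known weight trajectory from simulated choices in this way is a useful sanity check before fitting real training data.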
Year
2018
Venue
Advances in Neural Information Processing Systems 31 (NIPS 2018)
Keywords
over time, artificial intelligence, the model, marginal likelihood, sensory discrimination, the process, human subjects, desktop computer
Field
Covariate, Weighting, Hyperparameter, Inference, Computer science, Laplace's method, Marginal likelihood, Weight, Artificial intelligence, Logistic regression, Machine learning
DocType
Conference
Volume
31
ISSN
1049-5258
Citations
1
PageRank
0.41
References
0
Authors
5
Name                 Order   Citations   PageRank
Nicholas Roy         1       3644        288.27
Ji Hyun Bak          2       1           0.41
Athena Akrami        3       1           0.75
Carlos D. Brody      4       24          3.98
Jonathan W. Pillow   5       346         39.95