Title
Kernel-Based Approaches for Sequence Modeling: Connections to Neural Methods
Abstract
We investigate time-dependent data analysis from the perspective of recurrent kernel machines, from which models with hidden units and gated memory cells arise naturally. By considering dynamic gating of the memory cell, a model closely related to the long short-term memory (LSTM) recurrent neural network is derived. Extending this setup to n-gram filters, the convolutional neural network (CNN), Gated CNN, and recurrent additive network (RAN) are also recovered as special cases. Our analysis provides a new perspective on the LSTM, while also extending it to n-gram convolutional filters. Experiments are performed on natural language processing tasks and on the analysis of local field potentials (neuroscience). We demonstrate that the variants derived from kernels perform on par with or better than traditional neural methods. For the neuroscience application, the new models demonstrate significant improvements relative to the prior state of the art.
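To make the gated-memory construction concrete, the following is a minimal sketch (not the authors' code) of a RAN-style recurrent cell with dynamic input and forget gating, of the kind the abstract describes as arising from recurrent kernel machines. All names, dimensions, and initialization choices here are illustrative assumptions.

    # Illustrative sketch only: a minimal gated recurrent memory cell in the
    # spirit of the RAN/LSTM-style variants discussed in the abstract.
    # Names, shapes, and initialization are assumptions, not the paper's code.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class GatedMemoryCell:
        def __init__(self, input_dim, hidden_dim, seed=0):
            rng = np.random.default_rng(seed)
            scale = 1.0 / np.sqrt(input_dim + hidden_dim)
            # Candidate projection plus input and forget gates (RAN-like cell).
            self.W_c = rng.normal(0.0, scale, (hidden_dim, input_dim))
            self.W_i = rng.normal(0.0, scale, (hidden_dim, input_dim + hidden_dim))
            self.W_f = rng.normal(0.0, scale, (hidden_dim, input_dim + hidden_dim))
            self.hidden_dim = hidden_dim

        def forward(self, xs):
            # xs: sequence of input vectors, each of shape (input_dim,)
            c = np.zeros(self.hidden_dim)
            states = []
            for x in xs:
                z = np.concatenate([x, c])
                i = sigmoid(self.W_i @ z)       # input gate
                f = sigmoid(self.W_f @ z)       # forget gate (dynamic gating of the memory)
                c = f * c + i * (self.W_c @ x)  # additive memory update
                states.append(np.tanh(c))       # emitted hidden state
            return np.stack(states)

    # Example usage: encode a toy sequence of 10 steps with 8-dimensional inputs.
    cell = GatedMemoryCell(input_dim=8, hidden_dim=16)
    outputs = cell.forward(np.random.randn(10, 8))
    print(outputs.shape)  # (10, 16)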
Year
2019
Venue
Advances in Neural Information Processing Systems 32 (NIPS 2019)
Keywords
convolutional neural network, recurrent neural network, local field potentials
Field
Kernel (linear algebra), Computer science, Sequence modeling, Artificial intelligence, Machine learning
DocType
Conference
Volume
32
ISSN
1049-5258
Citations
0
PageRank
0.34
References
0
Authors
5
Name            Order  Citations  PageRank
Kevin J. Liang  1      2          4.42
Guoyin Wang     2      24         7.38
Yitong Li       3      44         7.98
Ricardo Henao   4      286        23.85
L. Carin        5      4603       339.36