Title
Semi-parametric training of autoencoders with Gaussian kernel smoothed topology learning neural networks
Abstract
Autoencoders are essential for training multi-hidden-layer neural networks. Parametric autoencoder training often requires the user to select the number of hidden neurons and the kernel type. In this paper, a semi-parametric autoencoder training method based on self-organized learning and incremental learning is proposed. The cost function is constructed incrementally by nonparametric learning, and the model parameter is trained by parametric learning. First, a topology learning neural network such as growing neural gas or the self-organizing incremental neural network is trained to obtain a discrete representation of the training data. Second, the correlations between different dimensions are modeled as a joint distribution using the neural network representation and kernel smoothers. Finally, the loss function is defined as the regression prediction error when each dimension is taken as the response variable in a density regression. The kernel parameter is selected by gradient descent, minimizing the reconstruction error on a data subset. The proposed architecture offers high training space efficiency, owing to incremental training, and automated selection of the number of hidden neurons. Experiments are carried out on four UCI datasets and an image interpolation task. Results show that the proposed methods outperform perceptron-architecture autoencoders and the restricted Boltzmann machine in nonlinear feature learning.
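To make the pipeline described in the abstract concrete, the following is a minimal sketch, not the paper's implementation: k-means prototypes stand in for the growing-neural-gas / SOINN codebook, reconstruction is a Gaussian-kernel (Nadaraya-Watson style) regression onto the prototypes, and the single kernel bandwidth is chosen to minimize reconstruction error on a held-out subset (a grid search stand-in for the gradient-descent selection in the paper). All function names and parameter values are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans


def fit_prototypes(X, n_prototypes=50, seed=0):
    """Stand-in for topology learning: a discrete codebook of the data."""
    km = KMeans(n_clusters=n_prototypes, n_init=10, random_state=seed).fit(X)
    return km.cluster_centers_


def reconstruct(X, prototypes, h):
    """Gaussian-kernel-smoothed reconstruction of each sample from the codebook."""
    # Squared distances between samples and prototypes.
    d2 = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * h ** 2))              # Gaussian kernel weights
    w /= w.sum(axis=1, keepdims=True) + 1e-12     # normalize per sample
    return w @ prototypes                          # kernel-weighted average


def select_bandwidth(X_val, prototypes, candidates):
    """Pick the bandwidth with the lowest reconstruction error on a subset."""
    errs = [np.mean((X_val - reconstruct(X_val, prototypes, h)) ** 2)
            for h in candidates]
    return candidates[int(np.argmin(errs))]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 8))
    protos = fit_prototypes(X[:400])
    h = select_bandwidth(X[400:], protos, candidates=[0.1, 0.3, 1.0, 3.0])
    err = np.mean((X[400:] - reconstruct(X[400:], protos, h)) ** 2)
    print("selected bandwidth:", h, "reconstruction MSE:", err)
```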
Year
2020
DOI
10.1007/s00521-018-3897-z
Venue
Neural Computing and Applications
Keywords
Autoencoder, Nonparametric learning, Kernel density estimation, Incremental learning
DocType
Journal
Volume
32
Issue
9
ISSN
1433-3058
Citations
0
PageRank
0.34
References
20
Authors
5
Name	Order	Citations	PageRank
Zhiyang Xiang	1	0	0.34
Changshou Deng	2	39	10.80
Xueting Xiang	3	0	0.34
Mali Yu	4	0	0.34
Jing Xiong	5	18	7.00