Title
Regularizing Deep Neural Networks by Ensemble-based Low-Level Sample-Variances Method
Abstract
Deep Neural Networks (DNNs) with a large number of parameters are very powerful machine learning systems. However, overfitting is a serious problem in such networks. To date, many regularizers, such as dropout and data augmentation, have been proposed to prevent overfitting. Motivated by ensemble learning, we treat each hidden layer of a neural network as an ensemble of base learners by dividing its hidden units into non-overlapping groups, each of which is regarded as a base learner. Based on a theoretical analysis of the generalization error of ensemble estimators (the bias-variance-covariance decomposition), we find that the variance of each base learner plays an important role in preventing overfitting, and we propose a novel regularizer, the Ensemble-based Low-Level Sample-Variances Method (ELSM), which encourages each base learner in a hidden layer to have a low-level sample variance. Experiments across a number of datasets and network architectures show that ELSM can effectively reduce overfitting and improve the generalization ability of DNNs.
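The "bias-variance-covariance decomposition" cited in the abstract is the standard result for an averaging ensemble of M base learners f_1, ..., f_M: writing $\bar{f} = \frac{1}{M}\sum_{i=1}^{M} f_i$, the expected squared error decomposes as

$$\mathbb{E}\big[(\bar{f}-y)^2\big] = \overline{\mathrm{bias}}^2 + \frac{1}{M}\,\overline{\mathrm{var}} + \Big(1-\frac{1}{M}\Big)\,\overline{\mathrm{covar}},$$

so keeping each base learner's variance low directly shrinks the $\overline{\mathrm{var}}$ term. Based only on the abstract's description, a minimal sketch of such a group-wise penalty might look as follows; the function name, the choice of PyTorch, and the use of each group's mean activation as a base learner's "output" are our assumptions, not details from the paper.

```python
import torch

def group_sample_variance_penalty(hidden: torch.Tensor, num_groups: int) -> torch.Tensor:
    """Hypothetical ELSM-style penalty (a sketch, not the paper's code):
    split one hidden layer's units into non-overlapping groups ("base
    learners") and penalize each group's sample variance over the batch.

    hidden: (batch_size, num_units) activations of a single hidden layer.
    """
    batch_size, num_units = hidden.shape
    assert num_units % num_groups == 0, "units must divide evenly into groups"
    # (batch, groups, units_per_group): each group plays the role of a base learner
    groups = hidden.view(batch_size, num_groups, num_units // num_groups)
    # Assumption: summarize each base learner by its mean activation per sample
    outputs = groups.mean(dim=2)                      # (batch, groups)
    # Sample variance of each learner's output across the batch
    sample_var = outputs.var(dim=0, unbiased=True)    # (groups,)
    # Encourage low-level sample variance by penalizing the group average
    return sample_var.mean()

# Usage sketch: total_loss = task_loss + lam * group_sample_variance_penalty(h, 4)
```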
Year
2019
DOI
10.1145/3357384.3357921
Venue
Proceedings of the 28th ACM International Conference on Information and Knowledge Management
Keywords
bias-variance-covariance decomposition, ensemble learning, generalization ability, neural networks
Field
Data mining, Computer science, Deep neural networks
DocType
Conference
ISBN
978-1-4503-6976-3
Citations
0
PageRank
0.34
References
0
Authors
4
Name          Order  Citations  PageRank
Shuai Yao     1      0          0.68
Yuexian Hou   2      269        38.59
Liangzhu Ge   3      0          0.68
Zeting Hu     4      0          0.34