Title: Regularizing Neural Networks via Minimizing Hyperspherical Energy
Abstract:
Inspired by the Thomson problem in physics, where the distribution of mutually repelling electrons on a unit sphere can be modeled by minimizing a potential energy, hyperspherical energy minimization has demonstrated its potential for regularizing neural networks and improving their generalization. In this paper, we first study the important role that hyperspherical energy plays in neural network training by analyzing its training dynamics. We then show that naively minimizing hyperspherical energy becomes difficult as the space dimensionality grows, because the optimization is highly non-linear and non-convex, which limits further gains in generalization. To address these problems, we propose compressive minimum hyperspherical energy (CoMHE) as a more effective regularization for neural networks. Specifically, CoMHE uses projection mappings to reduce the dimensionality of neurons and minimizes their hyperspherical energy in the projected space. Based on different designs of the projection mapping, we propose several distinct yet well-performing variants and provide theoretical guarantees to justify their effectiveness. Our experiments show that CoMHE consistently outperforms existing regularization methods and can be easily applied to different neural networks.
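The idea described in the abstract can be sketched in a few lines: treat each neuron's weight vector as a direction, project the vectors to a lower-dimensional space, normalize them onto the unit hypersphere, and penalize their pairwise "energy" so the directions spread apart. The sketch below is a minimal NumPy illustration, not the authors' implementation; the function names (`hyperspherical_energy`, `comhe_penalty`), the Riesz s-energy form, and the Gaussian random projection are assumptions made for illustration.

```python
import numpy as np

def hyperspherical_energy(vs, s=1.0, eps=1e-8):
    # vs: (n, k) array of unit vectors on the hypersphere.
    # Riesz s-energy: sum over pairs i < j of 1 / ||v_i - v_j||^s.
    diffs = vs[:, None, :] - vs[None, :, :]          # (n, n, k) pairwise differences
    dists = np.linalg.norm(diffs, axis=-1)           # (n, n) pairwise distances
    iu = np.triu_indices(len(vs), k=1)               # indices of pairs i < j
    return np.sum(1.0 / (dists[iu] + eps) ** s)

def comhe_penalty(W, k=4, num_projections=5, s=1.0, rng=None):
    # W: (n, d) neuron weight vectors. Project to k dims with random
    # Gaussian maps, normalize onto the unit hypersphere, and average
    # the energy over several projections (one illustrative variant).
    rng = np.random.default_rng(rng)
    n, d = W.shape
    total = 0.0
    for _ in range(num_projections):
        P = rng.standard_normal((d, k)) / np.sqrt(k)  # random projection
        V = W @ P                                     # (n, k) projected neurons
        V = V / (np.linalg.norm(V, axis=1, keepdims=True) + 1e-8)
        total += hyperspherical_energy(V, s=s)
    return total / num_projections
```

In training, such a penalty would be added to the task loss with a small weight; intuitively, nearly parallel neuron directions yield tiny pairwise distances and hence a large energy, so minimizing it encourages diverse neurons.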
Year: 2020
DOI: 10.1109/CVPR42600.2020.00695
Venue: 2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR)
DocType: Conference
ISSN: 1063-6919
Citations: 0
PageRank: 0.34
References: 38
Authors: 8
Name            Order  Citations  PageRank
Rongmei Lin     1      6          3.46
Weiyang Liu     2      101        9.23
Zhen Liu        3      40         5.01
Chen Feng       4      115        12.67
Zhiding Yu      5      421        30.08
James M. Rehg   6      5259       474.66
Li Xiong        7      2335       142.42
Le Song         8      2437       159.27