Title
Training Neural Networks by Lifted Proximal Operator Machines
Abstract
We present the lifted proximal operator machine (LPOM) to train fully-connected feed-forward neural networks. LPOM represents the activation function as an equivalent proximal operator and adds the proximal operators to the objective function of a network as penalties. LPOM is block multi-convex in all layer-wise weights and activations. This allows us to develop a new block coordinate descent (BC...
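The abstract names two ingredients: writing the activation function as a proximal operator and optimizing the layer-wise weights and activations by block coordinate descent on a penalized ("lifted") objective. Below is a minimal NumPy sketch of that general idea only, not the paper's exact LPOM objective or update rules; the quadratic penalty, the ReLU-as-projection step, the network sizes, and all names (relu_as_prox, mu, W1, W2, X1) are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def relu_as_prox(v):
    # ReLU(v) = argmin_{u >= 0} 0.5 * (u - v)**2, i.e. the proximal operator
    # (Euclidean projection) associated with the indicator of the nonnegative orthant.
    return np.maximum(v, 0.0)

# Toy regression data and a 2-layer network (all sizes are illustrative).
d_in, d_hid, d_out, n = 5, 8, 3, 64
X0 = rng.standard_normal((d_in, n))
Y = rng.standard_normal((d_out, n))

W1 = 0.1 * rng.standard_normal((d_hid, d_in))
W2 = 0.1 * rng.standard_normal((d_out, d_hid))
X1 = relu_as_prox(W1 @ X0)   # "lifted" variable standing in for the hidden activation
mu = 1.0                     # penalty weight coupling X1 to the layer below

for _ in range(50):
    # Activation block: with W1, W2 fixed, approximately minimize
    #   0.5*||W2 X1 - Y||^2 + 0.5*mu*||X1 - W1 X0||^2   over X1 >= 0
    # by solving the unconstrained quadratic, then applying the ReLU prox
    # (projection) as a simple treatment of the nonnegativity constraint.
    A = W2.T @ W2 + mu * np.eye(d_hid)
    B = W2.T @ Y + mu * (W1 @ X0)
    X1 = relu_as_prox(np.linalg.solve(A, B))

    # Weight blocks: with the activations fixed, each weight update is an
    # ordinary (lightly regularized) least-squares problem, hence convex.
    W1 = (X1 @ X0.T) @ np.linalg.inv(X0 @ X0.T + 1e-6 * np.eye(d_in))
    W2 = (Y @ X1.T) @ np.linalg.inv(X1 @ X1.T + 1e-6 * np.eye(d_hid))

loss = 0.5 * np.linalg.norm(W2 @ relu_as_prox(W1 @ X0) - Y) ** 2 / n
print(f"feed-forward loss after block coordinate descent: {loss:.4f}")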
Year
2022
DOI
10.1109/TPAMI.2020.3048430
Venue
IEEE Transactions on Pattern Analysis and Machine Intelligence
Keywords
Training, Artificial neural networks, Linear programming, Convergence, Tuning, Standards, Patents
DocType
Journal
Volume
44
Issue
6
ISSN
0162-8828
Citations
0
PageRank
0.34
References
0
Authors
6
Name            Order   Citations   PageRank
Jia Li          1       13          3.33
Mingqing Xiao   2       4           4.10
Cong Fang       3       17          7.14
Yue Dai         4       0           0.34
Chao Xu         5       1327        62.65
Zhouchen Lin    6       4805        203.69