Abstract
---

We present the lifted proximal operator machine (LPOM) to train fully-connected feed-forward neural networks. LPOM represents the activation function as an equivalent proximal operator and adds the proximal operators to the objective function of a network as penalties. LPOM is block multi-convex in all layer-wise weights and activations. This allows us to develop a new block coordinate descent (BC...
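The abstract's core idea, that a non-decreasing activation function can be written as a proximal operator, can be illustrated concretely for ReLU. The sketch below is not the paper's implementation; it is a minimal numerical check of the general fact that the proximal operator of the indicator function of the nonnegative orthant equals `max(x, 0)`. The function names and the grid-search evaluation are illustrative choices.

```python
import numpy as np

def relu(x):
    """The activation function in closed form."""
    return np.maximum(x, 0.0)

def prox_indicator_nonneg(x, grid=None):
    """Evaluate prox_g(x) = argmin_u 0.5*(u - x)^2 + g(u),
    where g is the indicator of the feasible set {u >= 0},
    by brute-force search over a dense grid of feasible u."""
    if grid is None:
        grid = np.linspace(0.0, 10.0, 100001)  # feasible u >= 0 only
    return grid[np.argmin(0.5 * (grid - x) ** 2)]

# For positive inputs the unconstrained minimizer u = x is feasible,
# so the prox returns x; for negative inputs it clamps to the
# boundary u = 0 -- exactly the ReLU map.
```

Representing the activation this way is what lets LPOM replace the hard nonlinear equality constraints between layers with convex penalty terms, yielding the block multi-convex objective the abstract mentions.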
Year | DOI | Venue
---|---|---
2022 | 10.1109/TPAMI.2020.3048430 | IEEE Transactions on Pattern Analysis and Machine Intelligence

Keywords | DocType | Volume
---|---|---
Training, Artificial neural networks, Linear programming, Convergence, Tuning, Standards, Patents | Journal | 44

Issue | ISSN | Citations
---|---|---
6 | 0162-8828 | 0

PageRank | References | Authors
---|---|---
0.34 | 0 | 6
Name | Order | Citations | PageRank
---|---|---|---
Jia Li | 1 | 13 | 3.33 |
Mingqing Xiao | 2 | 4 | 4.10 |
Cong Fang | 3 | 17 | 7.14 |
Yue Dai | 4 | 0 | 0.34 |
Chao Xu | 5 | 1327 | 62.65 |
Zhouchen Lin | 6 | 4805 | 203.69 |