Abstract |
---|
We present a comprehensive study of multilayer neural networks with binary activation functions, relying on PAC-Bayesian theory. Our contributions are twofold: (i) we develop an end-to-end framework to train binary activated deep neural networks, and (ii) we provide nonvacuous PAC-Bayesian generalization bounds for binary activated deep neural networks. Our results are obtained by minimizing the expected loss of an architecture-dependent aggregation of binary activated deep neural networks. Our analysis inherently overcomes the fact that the binary activation function is non-differentiable. The performance of our approach is assessed through a thorough experimental protocol on real-life datasets. |
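The following is a minimal sketch of why aggregating binary activated networks sidesteps the non-differentiability of the sign activation: assuming an isotropic Gaussian distribution over a neuron's weight vector, the expected output of the sign activation has a closed-form, differentiable expression (an erf), which is the kind of smooth surrogate that minimizing the expected loss of the aggregation relies on. The function names below (`expected_sign_output`, `deterministic_sign_forward`) are illustrative and not taken from the paper.

```python
import numpy as np
from scipy.special import erf

def expected_sign_output(w, x):
    """Expected output of a sign-activated neuron when the weight vector is
    drawn from an isotropic Gaussian N(w, I):
        E[sgn(W . x)] = erf(w . x / (sqrt(2) * ||x||)).
    The expectation is smooth in w even though sgn itself is not differentiable.
    (Illustrative sketch, not the paper's exact estimator.)"""
    return erf(w @ x / (np.sqrt(2) * np.linalg.norm(x)))

def deterministic_sign_forward(layers, x):
    """Plain binary activated forward pass: each layer applies sgn(W h),
    whose gradient is zero almost everywhere."""
    h = x
    for W in layers:
        h = np.sign(W @ h)
    return h

# Tiny usage example with random weights.
rng = np.random.default_rng(0)
x = rng.normal(size=10)
w = rng.normal(size=10)
print(expected_sign_output(w, x))                                 # smooth value in (-1, 1)
print(deterministic_sign_forward([rng.normal(size=(5, 10))], x))  # hard +/-1 activations
```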
Year | Venue | Keywords |
---|---|---|
2019 | ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019) | bayesian theory |
Field | DocType | Volume
---|---|---|
Expected loss, Activation function, Differentiable function, Artificial intelligence, Artificial neural network, Deep neural networks, Mathematics, Machine learning, Binary number, Bayesian probability | Journal | 32
ISSN | Citations | PageRank
---|---|---|
1049-5258 | 0 | 0.34
References | Authors
---|---|
0 | 4
Name | Order | Citations | PageRank |
---|---|---|---|
Gaël Letarte | 1 | 0 | 0.34 |
Pascal Germain | 2 | 657 | 27.49 |
Benjamin Guedj | 3 | 9 | 8.82 |
François Laviolette | 4 | 1036 | 65.93 |