Normalized Flat Minima: Exploring Scale Invariant Definition of Flat Minima for Neural Networks using PAC-Bayesian Analysis. | 0 | 0.34 | 2019 |
Lipschitz-Margin Training: Scalable Certification of Perturbation Invariance for Deep Neural Networks. | 12 | 0.50 | 2018 |
Variance-based Gradient Compression for Efficient Distributed Deep Learning. | 3 | 0.44 | 2018 |
On the Structural Sensitivity of Deep Convolutional Networks to the Directions of Fourier Basis Functions. | 0 | 0.34 | 2018 |