Title
Local Geometry of Cross Entropy Loss in Learning One-Hidden-Layer Neural Networks
Abstract
We study model recovery for data classification, where the training labels are generated by a one-hidden-layer neural network with sigmoid activations and the goal is to recover the weights of the network. We consider two network models: the fully-connected network (FCN) and the non-overlapping convolutional neural network (CNN). We prove that with Gaussian inputs, the empirical risk based on cross entropy exhibits strong convexity and smoothness uniformly in a local neighborhood of the ground truth, as soon as the sample size is sufficiently large. Consequently, if initialized in this neighborhood, gradient descent on the cross-entropy empirical risk converges locally and learns the one-hidden-layer network at near-optimal sample and computational complexity with respect to the network input dimension, without unrealistic assumptions such as requiring a fresh set of samples at each iteration.
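The setting in the abstract admits a compact numerical illustration. The following is a minimal sketch, not the authors' implementation: it draws Gaussian inputs, generates binary labels from a one-hidden-layer FCN with sigmoid activations (output averaged over hidden units so it serves as a Bernoulli parameter), and runs gradient descent on the cross-entropy empirical risk from an initialization near the ground truth. The dimensions, sample size, initialization radius, step size, and iteration count are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, K, n = 10, 3, 50_000   # input dimension, hidden units, samples (assumed values)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(W, X):
    # Average of the K sigmoid units: output lies in (0, 1), so it can
    # serve as the Bernoulli parameter in the cross-entropy loss.
    return sigmoid(X @ W.T).mean(axis=1)

# Ground truth and data: Gaussian inputs, labels y ~ Bernoulli(f(W*, x)).
W_star = rng.normal(size=(K, d))
X = rng.normal(size=(n, d))
y = rng.binomial(1, forward(W_star, X)).astype(float)

def grad(W):
    # Gradient of the empirical cross-entropy risk via the chain rule.
    p = np.clip(forward(W, X), 1e-12, 1 - 1e-12)
    dp = (p - y) / (p * (1 - p)) / n     # dLoss/dp for each sample
    S = sigmoid(X @ W.T)                 # hidden activations, shape (n, K)
    return ((dp[:, None] * S * (1 - S)) / K).T @ X   # shape (K, d)

# Gradient descent started inside a neighborhood of W*, matching the
# local convergence regime analyzed in the paper.
W0 = W_star + 0.1 * rng.normal(size=(K, d))
W = W0.copy()
for _ in range(300):
    W -= 1.0 * grad(W)   # step size is an arbitrary illustrative choice

print("distance to W*: init %.4f -> final %.4f"
      % (np.linalg.norm(W0 - W_star), np.linalg.norm(W - W_star)))
```

Under the paper's result, the loss landscape is strongly convex and smooth in this neighborhood, so plain gradient descent with a suitable step size contracts the distance to the ground-truth weights.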
Year
2019
DOI
10.1109/ISIT.2019.8849289
Venue
2019 IEEE International Symposium on Information Theory (ISIT)
Field
Cross entropy, Discrete mathematics, Gradient descent, Computer science, Convolutional neural network, Empirical risk minimization, Algorithm, Local convergence, Artificial neural network, Network model, Computational complexity theory
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
3
Name            Order  Citations  PageRank
Haoyu Fu        1      9          1.81
Yuejie Chi      2      720        56.67
Yingbin Liang   3      1646       147.64