Title: Online Normalization for Training Neural Networks
Abstract: Online Normalization is a new technique for normalizing the hidden activations of a neural network. Like Batch Normalization, it normalizes along the sample dimension. While Online Normalization does not use batches, it is as accurate as Batch Normalization. We resolve a theoretical limitation of Batch Normalization by introducing an unbiased technique for computing the gradient of normalized activations. Online Normalization works with automatic differentiation by adding statistical normalization as a primitive. This technique can be used in cases not covered by some other normalizers, such as recurrent networks, fully connected networks, and networks with activation memory requirements prohibitive for batching. We show its applications to image classification, image segmentation, and language modeling. We present formal proofs and experimental results on the ImageNet, CIFAR, and PTB datasets.
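To make the core idea concrete, here is a minimal NumPy sketch of the forward-pass intuition: normalize each incoming sample with exponentially decaying running estimates of mean and variance instead of batch statistics. This is an illustrative simplification under assumed details, not the paper's exact algorithm; the class name `OnlineNorm`, the decay rate `alpha`, and the specific moving-average update rule are assumptions, and the paper's key contribution, an unbiased gradient correction in the backward pass, is omitted here.

```python
import numpy as np

class OnlineNorm:
    """Sketch of online (batch-free) activation normalization.

    Each sample is normalized with the *current* running estimates of
    mean and variance, which are then updated with exponential decay.
    No batch statistics are ever computed, so this works at batch size 1.
    The update rule below is one common EMA formulation, assumed for
    illustration; it is not necessarily the paper's exact parameterization.
    """

    def __init__(self, num_features, alpha=0.999, eps=1e-5):
        self.alpha = alpha                 # decay rate (assumed hyperparameter)
        self.eps = eps                     # numerical-stability constant
        self.mu = np.zeros(num_features)   # running mean estimate
        self.var = np.ones(num_features)   # running variance estimate

    def __call__(self, x):
        # Normalize with the running statistics, not batch statistics.
        y = (x - self.mu) / np.sqrt(self.var + self.eps)
        # Update running variance using the pre-update mean, then the mean.
        self.var = self.alpha * self.var + (1 - self.alpha) * (x - self.mu) ** 2
        self.mu = self.alpha * self.mu + (1 - self.alpha) * x
        return y

norm = OnlineNorm(num_features=4)
for _ in range(3):
    print(norm(np.random.randn(4)))
```

Because the statistics are maintained online per sample, the layer has no dependence on batch composition, which is what lets the technique cover the batch-unfriendly settings the abstract lists (recurrent networks, fully connected networks, and memory-constrained models).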
Year: 2019
Venue: Advances in Neural Information Processing Systems 32 (NIPS 2019)
Keywords: image classification, neural networks, language model, image segmentation
Field: Normalization (statistics), Automatic differentiation, Image segmentation, Mathematical proof, Artificial intelligence, Contextual image classification, Artificial neural network, Mathematics, Machine learning, Language model
DocType: Journal
Volume: 32
ISSN: 1049-5258
Citations: 0
PageRank: 0.34
References: 0
Authors: 8