Title
Batch Normalization in the final layer of generative networks.
Abstract
Generative networks have shown great promise in generating photo-realistic images, yet the theory surrounding them remains an active research area. Much of the practical work with generative networks relies on heuristics that tend to produce good results. One such heuristic is the advice not to use Batch Normalization in the final layer of the generator network. Many state-of-the-art generative network architectures follow this heuristic, but the reasons given for doing so are inconsistent. This paper shows that this is not necessarily a good heuristic and that Batch Normalization can be beneficial in the final layer of the generator network, either by placing it before the final non-linear activation (usually a $\tanh$) or by replacing the final $\tanh$ activation altogether with Batch Normalization and clipping. We show that this can lead to faster training of generator networks by matching the generator to the mean and standard deviation of the target distribution's image colour values.
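The two final-layer variants the abstract describes can be sketched in a few lines of PyTorch. This is a minimal illustration under assumed settings, not the paper's implementation: the module names, channel sizes, and kernel parameters are hypothetical. One head places Batch Normalization before the $\tanh$; the other replaces the $\tanh$ with Batch Normalization followed by clipping to [-1, 1].

```python
# Minimal sketch of the two generator output heads discussed in the abstract.
# All names and layer hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn


class BNBeforeTanhHead(nn.Module):
    """Final generator layer: Batch Normalization placed before the tanh."""

    def __init__(self, in_channels: int = 64, out_channels: int = 3):
        super().__init__()
        self.conv = nn.ConvTranspose2d(in_channels, out_channels,
                                       kernel_size=4, stride=2, padding=1)
        # Normalises the output colour channels before the squashing activation.
        self.bn = nn.BatchNorm2d(out_channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.tanh(self.bn(self.conv(x)))


class BNClipHead(nn.Module):
    """Final generator layer: tanh replaced by Batch Normalization + clipping."""

    def __init__(self, in_channels: int = 64, out_channels: int = 3):
        super().__init__()
        self.conv = nn.ConvTranspose2d(in_channels, out_channels,
                                       kernel_size=4, stride=2, padding=1)
        self.bn = nn.BatchNorm2d(out_channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Clip to the [-1, 1] image range the rest of the pipeline expects.
        return torch.clamp(self.bn(self.conv(x)), -1.0, 1.0)


if __name__ == "__main__":
    feats = torch.randn(8, 64, 16, 16)        # dummy penultimate feature maps
    print(BNBeforeTanhHead()(feats).shape)    # torch.Size([8, 3, 32, 32])
    print(BNClipHead()(feats).shape)          # torch.Size([8, 3, 32, 32])
```

In both variants the learnable affine parameters of Batch Normalization can shift the generator's outputs toward the mean and standard deviation of the target images' colour values, which is the mechanism the abstract credits for faster training.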
Year: 2018
Venue: arXiv: Computer Vision and Pattern Recognition
Field: Heuristic, Normalization (statistics), Computer science, Network architecture, Heuristics, Artificial intelligence, Generative grammar, Standard deviation, Machine learning
DocType:
Volume: abs/1805.07389
Citations: 1
Journal:
PageRank: 0.35
References: 0
Authors: 2
Name            Order  Citations  PageRank
Sean Mullery    1      1          0.35
Paul F. Whelan  2      5613       9.95