
On the effect of Batch Normalization and Weight Normalization in Generative Adversarial Networks

Abstract · Apr 13, 2017 02:15

Tags: gans · training · weight · batch · gan · samples · found · quality · normalization · stat-ml · cs-cv · cs-lg

arXiv Abstract

  • Sitao Xiang
  • Hao Li

As in many neural network architectures, the use of Batch Normalization (BN) has become common practice for Generative Adversarial Networks (GANs). In this paper, we propose using the Euclidean reconstruction error on a test set to evaluate the quality of GANs. Under this measure, together with a careful visual analysis of generated samples, we found that while BN can speed up training in its early stages, it may harm the quality of the trained model and the stability of the training process. Furthermore, Weight Normalization, a more recently proposed technique, is found to improve reconstruction, training speed, and especially the stability of GANs, and should therefore be used in place of BN in GAN training.
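The proposed metric can be made concrete as a search in the generator's latent space: for each test image, find the code z whose output is closest in L2 distance, and report that distance. Below is a minimal PyTorch sketch of this idea; the generator `G`, latent size `z_dim`, and the optimizer settings are illustrative assumptions, not details taken from the paper.

```python
import torch

def euclidean_reconstruction_error(G, x, z_dim=128, steps=200, lr=0.05):
    """Approximate min_z ||G(z) - x||^2 for a batch of test images x.

    Assumptions: G maps a (N, z_dim) latent batch to images shaped like x;
    steps and lr are arbitrary illustrative settings.
    """
    z = torch.randn(x.size(0), z_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        # squared L2 distance per image, averaged over the batch
        loss = ((G(z) - x) ** 2).flatten(1).sum(1).mean()
        loss.backward()
        opt.step()
    with torch.no_grad():  # report the error at the final latent codes
        return ((G(z) - x) ** 2).flatten(1).sum(1).mean().item()
```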
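For context, Weight Normalization (Salimans & Kingma, 2016) reparameterizes each weight vector as w = g * v / ||v||, decoupling its magnitude g from its direction v. In PyTorch the standard formulation is a one-line wrapper; the sketch below applies it to a hypothetical generator layer whose sizes are made up for illustration:

```python
import torch.nn as nn

# Wrap a (hypothetical) transposed-conv generator layer so its weight is
# reparameterized as w = g * v / ||v||, with g and v learned separately.
layer = nn.utils.weight_norm(
    nn.ConvTranspose2d(128, 64, kernel_size=4, stride=2, padding=1)
)
```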

Read the paper (pdf) »