Progressive Growing of GANs for Improved Quality, Stability, and Variation

Tero Karras, Timo Aila, Samuli Laine, Jaakko Lehtinen

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Professional

2788 Citations (Scopus)

Abstract

We describe a new training methodology for generative adversarial networks. The key idea is to grow both the generator and discriminator progressively: starting from a low resolution, we add new layers that model increasingly fine details as training progresses. This both speeds the training up and greatly stabilizes it, allowing us to produce images of unprecedented quality, e.g., CelebA images at 1024^2. We also propose a simple way to increase the variation in generated images, and achieve a record inception score of 8.80 in unsupervised CIFAR10. Additionally, we describe several implementation details that are important for discouraging unhealthy competition between the generator and discriminator. Finally, we suggest a new metric for evaluating GAN results, both in terms of image quality and variation. As an additional contribution, we construct a higher-quality version of the CelebA dataset.
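The abstract's core idea — adding higher-resolution layers during training and blending them in gradually — can be illustrated with a minimal NumPy sketch. The function names (`fade_in`, `nearest_upsample`) and the linear blending weight `alpha` are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def nearest_upsample(x):
    """Double spatial resolution by repeating each pixel 2x2.
    x has shape (H, W, C)."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

def fade_in(low_res_path, high_res_path, alpha):
    """Smoothly blend the upsampled output of the existing
    low-resolution pathway with the newly added higher-resolution
    block. alpha ramps from 0 to 1 as training progresses, so the
    new layers are introduced gradually rather than abruptly."""
    return (1.0 - alpha) * low_res_path + alpha * high_res_path

# Toy example: growing the generator output from 4x4 to 8x8.
old_output = np.random.rand(4, 4, 3)   # output of the existing 4x4 stage
new_output = np.random.rand(8, 8, 3)   # output of the new 8x8 block
blended = fade_in(nearest_upsample(old_output), new_output, alpha=0.3)
print(blended.shape)  # (8, 8, 3)
```

At `alpha = 0` the network behaves exactly as before the new block was added; at `alpha = 1` the new block fully takes over. The same fade-in is applied symmetrically on the discriminator side.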
Original language: English
Title of host publication: Proceedings of International Conference on Learning Representations (ICLR) 2018
Number of pages: 26
Publication status: Published - 2018
MoE publication type: D3 Professional conference proceedings
Event: International Conference on Learning Representations - Vancouver, Canada
Duration: 30 Apr 2018 - 3 May 2018
Conference number: 6
https://iclr.cc/Conferences/2018

Conference

Conference: International Conference on Learning Representations
Abbreviated title: ICLR
Country/Territory: Canada
City: Vancouver
Period: 30/04/2018 - 03/05/2018
Internet address: https://iclr.cc/Conferences/2018

Keywords

  • generative adversarial networks
  • unsupervised learning
  • hierarchical methods
