Infoscience
EPFL, École polytechnique fédérale de Lausanne
conference paper

Unsupervised Controllable Generation with Self-Training

Chrysos, Grigorios G. • Kossaifi, Jean • Yu, Zhiding • Anandkumar, Anima
January 1, 2021
Proceedings of 2021 IEEE International Joint Conference on Neural Networks (IJCNN)
International Joint Conference on Neural Networks (IJCNN 2021)

Recent generative adversarial networks (GANs) can generate impressive photo-realistic images. However, controllable generation with GANs remains an open research problem. Achieving controllable generation requires semantically interpretable and disentangled factors of variation, which is difficult to obtain with simple fixed distributions such as the Gaussian distribution. Instead, we propose an unsupervised framework that learns, through self-training, a distribution of latent codes that control the generator. Self-training provides iterative feedback in the GAN training, from the discriminator to the generator, and progressively improves the proposal of the latent codes as training proceeds. The latent codes are sampled from a latent variable model that is learned in the feature space of the discriminator. We consider a normalized independent component analysis model and learn its parameters through tensor factorization of the higher-order moments. Our framework exhibits better disentanglement than variants such as the variational autoencoder, and discovers semantically meaningful latent codes without any supervision. We demonstrate empirically on both car and face datasets that each group of elements in the learned code controls a mode of variation with a semantic meaning, e.g., a pose or background change. We also show with quantitative metrics that our method generates better results than other approaches.
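The feedback loop the abstract describes can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: the generator and discriminator below are fixed stand-in maps, and a diagonal Gaussian stands in for the normalized-ICA latent-variable model that the paper actually fits via tensor factorization of higher-order moments. What the sketch shows is the structure of the iteration: sample codes from the current proposal, push them through the generator, embed the results in the discriminator's feature space, and refit the proposal there.

```python
import numpy as np

# Hypothetical stand-ins for the trained networks (random fixed maps).
rng = np.random.default_rng(0)
dim_x, dim_z = 8, 4                      # "image" and latent-code dimensions
G = rng.normal(size=(dim_z, dim_x))      # stand-in generator weights
D = rng.normal(size=(dim_x, dim_z))      # stand-in discriminator feature map

def generator(z):
    return z @ G                         # latent code -> generated sample

def disc_features(x):
    return np.tanh(x @ D)                # sample -> discriminator feature space

# Proposal distribution over latent codes, initially a standard Gaussian
# (a diagonal Gaussian here, in place of the paper's normalized-ICA model).
mu, sigma = np.zeros(dim_z), np.ones(dim_z)

for _ in range(3):                       # a few self-training rounds
    # 1. Sample latent codes from the current proposal distribution.
    z = mu + sigma * rng.normal(size=(256, dim_z))
    # 2. Generate samples and embed them with the discriminator.
    feats = disc_features(generator(z))
    # 3. Refit the latent-variable model on the features, closing the
    #    discriminator -> generator feedback loop.
    mu = feats.mean(axis=0)
    sigma = feats.std(axis=0) + 1e-6
```

In the paper the refitting step is where the structure comes from: the latent-variable model learned in feature space is an independent component analysis model, so refitting it encourages the sampled codes to align with disentangled, semantically meaningful directions rather than an arbitrary Gaussian.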

Details
Type
conference paper
DOI
10.1109/IJCNN52387.2021.9534045
Web of Science ID
WOS:000722581705123

Author(s)
Chrysos, Grigorios G.  
Kossaifi, Jean
Yu, Zhiding
Anandkumar, Anima
Date Issued
2021-01-01
Publisher
IEEE
Publisher place
New York

Published in
Proceedings of 2021 IEEE International Joint Conference on Neural Networks (IJCNN)
ISBN of the book
978-1-6654-3900-8
Series title/Series vol.
IEEE International Joint Conference on Neural Networks (IJCNN)

Subjects
Computer Science, Artificial Intelligence • Computer Science, Hardware & Architecture • Engineering, Electrical & Electronic • Computer Science • Engineering

Editorial or Peer reviewed
REVIEWED
Written at
EPFL

EPFL units
LIONS  
Event name
International Joint Conference on Neural Networks (IJCNN 2021)
Event place
Shenzhen, China
Event date
Jul 18-22, 2021

Available on Infoscience
January 15, 2022
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/184545
Contact
infoscience@epfl.ch
Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.