A Chainer implementation of InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets (http://arxiv.org/abs/1606.03657). The latent codes are:
- c1 ~ Cat(K = 10, p = 0.1)
- c2 ~ Unif(-1, 1)
- c3 ~ Unif(-1, 1)
The 10-dimensional one-hot categorical code and the 2 continuous codes are sampled, concatenated with 62 noise variables z, and fed into the generator. The following are sample images produced by a generator trained for 100 epochs on the MNIST training set of 60,000 samples.
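As a rough sketch of this sampling step (not necessarily the exact code used in this repository; the helper name `sample_latent` and the uniform distribution assumed for z are illustrative only), the generator input can be assembled like this:

```python
import numpy as np

def sample_latent(batch_size, n_categories=10, n_continuous=2, n_noise=62):
    """Sample the InfoGAN latent input for MNIST."""
    # c1 ~ Cat(K = 10, p = 0.1), encoded as a one-hot vector
    categories = np.random.randint(0, n_categories, size=batch_size)
    c1 = np.eye(n_categories, dtype=np.float32)[categories]
    # c2, c3 ~ Unif(-1, 1)
    c_cont = np.random.uniform(-1, 1, (batch_size, n_continuous)).astype(np.float32)
    # 62 noise variables z (a uniform distribution is assumed here)
    z = np.random.uniform(-1, 1, (batch_size, n_noise)).astype(np.float32)
    # Concatenate into the 74-dimensional generator input
    return np.concatenate([c1, c_cont, z], axis=1)
```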
Each row shows 10 random images generated from the same one-hot categorical vector. Since K = 10, there are 10 such rows, one for each category. The network learns latent representations of the 10 digits, with some exceptions such as the digit 1.
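A grid like this can be built by fixing the one-hot vector per row and re-sampling the continuous codes and noise; `categorical_grid_codes` below is a hypothetical helper, not a function from this repository:

```python
import numpy as np

def categorical_grid_codes(n_categories=10, n_per_row=10, n_continuous=2, n_noise=62):
    """Build latent inputs for a grid where each row fixes one category."""
    rows = []
    for k in range(n_categories):
        c1 = np.zeros((n_per_row, n_categories), dtype=np.float32)
        c1[:, k] = 1.0  # same one-hot vector for the whole row
        c_cont = np.random.uniform(-1, 1, (n_per_row, n_continuous)).astype(np.float32)
        z = np.random.uniform(-1, 1, (n_per_row, n_noise)).astype(np.float32)
        rows.append(np.concatenate([c1, c_cont, z], axis=1))
    return np.vstack(rows)  # shape: (100, 74)
```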
Interpolating over c2 shows that the network learns to vary the width of the digits.
Interpolating over c3 shows that the network learns the orientation of the digits.
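These interpolations can be reproduced by sweeping a single continuous code over [-1, 1] while holding the categorical code and the noise z fixed. The sketch below assumes the same latent layout as above; `interpolation_codes` is an illustrative name, not part of this repository:

```python
import numpy as np

def interpolation_codes(category, code_index, n_steps=10,
                        n_categories=10, n_continuous=2, n_noise=62):
    """Vary one continuous code (c2 or c3) over [-1, 1] with everything else fixed."""
    c1 = np.zeros((n_steps, n_categories), dtype=np.float32)
    c1[:, category] = 1.0  # fixed one-hot categorical code
    c_cont = np.zeros((n_steps, n_continuous), dtype=np.float32)
    c_cont[:, code_index] = np.linspace(-1, 1, n_steps, dtype=np.float32)
    # reuse the same noise vector z for every step
    z = np.tile(np.random.uniform(-1, 1, (1, n_noise)).astype(np.float32), (n_steps, 1))
    return np.concatenate([c1, c_cont, z], axis=1)
```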
Train the generator with:

```
python train.py --out-generator-filename ./trained/generator.model --gpu 0
```
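After training, the saved generator can presumably be loaded back with Chainer's serializers. The snippet below is only a sketch: it assumes the model was saved with `chainer.serializers.save_npz`, and the module name `net` and the `Generator` class are guesses to be adapted to this repository.

```python
import chainer
from chainer import serializers

from net import Generator  # hypothetical module/class name; adjust to this repo

generator = Generator()  # assumed constructor
serializers.load_npz('./trained/generator.model', generator)

# generate a batch of images without tracking gradients
with chainer.using_config('train', False), chainer.no_backprop_mode():
    latent = sample_latent(batch_size=10)  # from the sketch above
    images = generator(latent).array       # assumed call signature
```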