Repository info

This project implements an unsupervised generative modeling technique called Wasserstein Auto-Encoders (WAE), proposed by Tolstikhin, Bousquet, Gelly, and Schoelkopf (2017).

Repository structure

wae.py - everything specific to WAE, including encoder-decoder losses, various forms of distribution-matching penalties, and training pipelines
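One of the distribution-matching penalties used in the paper is the maximum mean discrepancy (MMD) between encoded samples and samples from the prior. The sketch below is illustrative only and not taken from wae.py: it computes an unbiased MMD estimate with a simple RBF kernel in NumPy (the paper also uses an inverse multiquadratic kernel), with hypothetical function names.

```python
import numpy as np

def rbf_kernel(x, y, sigma2=1.0):
    # Pairwise RBF kernel values between rows of x and rows of y.
    d2 = np.sum(x**2, 1)[:, None] + np.sum(y**2, 1)[None, :] - 2.0 * x @ y.T
    return np.exp(-d2 / (2.0 * sigma2))

def mmd_penalty(z_q, z_p, sigma2=1.0):
    # Unbiased MMD^2 estimate between encoded samples z_q ~ Q_Z
    # and prior samples z_p ~ P_Z (both arrays of shape [n, d]).
    n = z_q.shape[0]
    k_qq = rbf_kernel(z_q, z_q, sigma2)
    k_pp = rbf_kernel(z_p, z_p, sigma2)
    k_qp = rbf_kernel(z_q, z_p, sigma2)
    # Diagonal terms are dropped for the unbiased estimator.
    term_qq = (k_qq.sum() - np.trace(k_qq)) / (n * (n - 1))
    term_pp = (k_pp.sum() - np.trace(k_pp)) / (n * (n - 1))
    return term_qq + term_pp - 2.0 * k_qp.mean()
```

When z_q and z_p are drawn from the same distribution the penalty is close to zero; the WAE-MMD objective adds this term, scaled by a weight, to the reconstruction loss.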

run.py - master script to train a specific model on a selected dataset with specified hyperparameters
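A master script of this kind typically exposes the dataset, the penalty type, and the main hyperparameters as command-line flags. The snippet below is a generic argparse sketch of that pattern; all flag names are hypothetical and the actual run.py may use different arguments.

```python
import argparse

def build_parser():
    # Hypothetical flags for illustration; check run.py for the real ones.
    p = argparse.ArgumentParser(description="Train a WAE model")
    p.add_argument("--dataset", default="celebA",
                   help="dataset to train on")
    p.add_argument("--penalty", choices=["mmd", "gan"], default="mmd",
                   help="distribution-matching penalty (WAE-MMD or WAE-GAN)")
    p.add_argument("--epochs", type=int, default=50,
                   help="number of training epochs")
    p.add_argument("--lmbda", type=float, default=100.0,
                   help="weight of the matching penalty in the objective")
    return p
```

For example, `python run.py --dataset celebA --penalty mmd --epochs 50` would correspond to the WAE-MMD CelebA experiment shown below, under these assumed flag names.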

Example of output pictures

The following picture shows various characteristics of the WAE-MMD model trained on CelebA after 50 epochs:

[Figure: WAE-MMD progress]