This is an implementation of the paper "Auto-Encoding Variational Bayes" by D. P. Kingma and M. Welling. The code follows pytorch/examples/vae.
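At its core, the example optimizes the evidence lower bound (ELBO) using the reparameterization trick from the paper. The snippet below is a minimal, self-contained sketch of those two pieces in JAX; the function names and the Bernoulli (binary cross-entropy) reconstruction term are illustrative assumptions, not the repository's exact code.

```python
# Minimal sketch of the reparameterization trick and the negative ELBO,
# assuming a diagonal-Gaussian posterior and Bernoulli pixel likelihood.
import jax
import jax.numpy as jnp


def reparameterize(rng, mean, logvar):
    # z = mean + sigma * eps with eps ~ N(0, I); keeps sampling
    # differentiable with respect to mean and logvar.
    eps = jax.random.normal(rng, mean.shape)
    return mean + jnp.exp(0.5 * logvar) * eps


def kl_divergence(mean, logvar):
    # Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian posterior.
    return -0.5 * jnp.sum(1 + logvar - jnp.square(mean) - jnp.exp(logvar))


def binary_cross_entropy(logits, targets):
    # Bernoulli negative log-likelihood of the (binarized) pixels.
    return -jnp.sum(targets * jax.nn.log_sigmoid(logits)
                    + (1 - targets) * jax.nn.log_sigmoid(-logits))


def negative_elbo(logits, targets, mean, logvar):
    # Training loss: reconstruction term plus KL regularizer.
    return binary_cross_entropy(logits, targets) + kl_divergence(mean, logvar)
```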
Install the dependencies and run the example:

```bash
pip install -r requirements.txt
python main.py --workdir=/tmp/mnist --config=configs/default.py
```
This VAE example allows specifying a hyperparameter configuration by means of the `--config` flag. The configuration flag is defined using `config_flags`, which also allows overriding individual configuration fields. This can be done as follows:
```bash
python main.py \
  --workdir=/tmp/mnist --config=configs/default.py \
  --config.learning_rate=0.01 --config.num_epochs=10
```
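For reference, a configuration module like `configs/default.py` is typically a small Python file that returns an `ml_collections.ConfigDict`. The sketch below assumes that convention; only `learning_rate` and `num_epochs` are taken from the command above, and the remaining fields and all default values are illustrative assumptions.

```python
# Hypothetical sketch of a config module in the style of configs/default.py.
import ml_collections


def get_config():
    """Returns the default hyperparameter configuration."""
    config = ml_collections.ConfigDict()
    config.learning_rate = 1e-3   # overridable via --config.learning_rate
    config.num_epochs = 30        # overridable via --config.num_epochs
    config.batch_size = 128       # assumed field, for illustration only
    config.latents = 20           # assumed size of the latent code
    return config
```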
Running the code with the command above produces images generated by sampling from the prior, as well as reconstructions of test set digits.
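As a rough illustration of where the generated images come from, the sketch below draws latent codes from the standard normal prior and pushes them through a decoder; `decode_fn` is a hypothetical stand-in for the model's decoder, not an API from this example.

```python
# Sketch: sample latent codes from N(0, I) and decode them into images.
import jax
import jax.numpy as jnp


def generate(rng, decode_fn, num_samples=64, latent_dim=20):
    # decode_fn is assumed to map latent codes to image logits.
    z = jax.random.normal(rng, (num_samples, latent_dim))
    return jax.nn.sigmoid(decode_fn(z))  # pixel probabilities in [0, 1]
```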
The test set loss after 10 epochs should be around 104.