Auxiliary Deep Generative Models

This repository is the implementation of the auxiliary deep generative model presented at the NIPS 2015 Workshop on Advances in Approximate Bayesian Inference. The article shows state-of-the-art performance on MNIST and will be submitted to ICML 2016 in an extended version that includes more datasets.
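For orientation, a rough sketch of the auxiliary-variable bound behind the model (unsupervised form; the article's semi-supervised objectives additionally condition on the label y)::

    \log p_\theta(x) \;\geq\; \mathbb{E}_{q_\phi(a, z \mid x)}\!\left[ \log \frac{p_\theta(a \mid x, z)\, p_\theta(x \mid z)\, p(z)}{q_\phi(a \mid x)\, q_\phi(z \mid a, x)} \right]

Because the auxiliary density integrates to one, \int p_\theta(a \mid x, z)\, da = 1, adding a leaves the marginal p_\theta(x) unchanged while giving the inference model q_\phi(z \mid a, x) a more expressive, non-factorized form.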

The implementation is built on the Parmesan, Lasagne and Theano libraries.

Installation

Please make sure you have installed the requirements before executing the Python scripts.

Install

git clone https://github.com/casperkaae/parmesan.git
cd parmesan
python setup.py develop
pip install numpy
pip install seaborn
pip install matplotlib
pip install https://github.com/Theano/Theano/archive/master.zip
pip install https://github.com/Lasagne/Lasagne/archive/master.zip
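A minimal sketch (not part of the repository) to check that the dependencies installed above import correctly before running the scripts::

    # Try importing each dependency and print its version where one is exposed.
    import importlib

    for name in ("numpy", "matplotlib", "seaborn", "theano", "lasagne", "parmesan"):
        try:
            module = importlib.import_module(name)
            print(name, getattr(module, "__version__", "(no __version__ attribute)"))
        except ImportError as err:
            print("could not import {}: {}".format(name, err))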

Examples

The repository primarily includes

  • a script that trains a new model on the MNIST dataset with only 100 labels - run_adgmssl_mnist.py.
  • a script that evaluates a trained model (see model specifics in output/) - run_adgmssl_evaluation.py.
  • an IPython notebook in which all training is implemented in a single script - run_adgmssl_mnist_notebook.ipynb.

Please see the source code and code examples for further details.
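As a rough, self-contained illustration of the 100-label semi-supervised setting (this is not the repository's actual data loader; the helper below and its names are hypothetical), one way to build a balanced labelled/unlabelled split is::

    # Hypothetical sketch: balanced 100-label split of MNIST for semi-supervised
    # training (10 labelled examples per class). See run_adgmssl_mnist.py for the
    # loader actually used in this repository.
    import numpy as np

    def split_semi_supervised(x, y, n_labeled=100, n_classes=10, seed=1234):
        rng = np.random.RandomState(seed)
        per_class = n_labeled // n_classes
        labeled_idx = []
        for c in range(n_classes):
            class_idx = np.flatnonzero(y == c)
            labeled_idx.extend(rng.choice(class_idx, per_class, replace=False))
        labeled_idx = np.asarray(labeled_idx)
        unlabeled_idx = np.setdiff1d(np.arange(len(y)), labeled_idx)
        return (x[labeled_idx], y[labeled_idx]), x[unlabeled_idx]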

Comparison of training convergence between the adgmssl, the adgmssl with deterministic auxiliary units, and the dgmssl.

/output/train.png

Information contribution from the auxiliary units a and the latent units z, respectively.

/output/diff.png

A random sample from the latent space run through the generative model.

/output/mnist.png