mazrk7/gmvae

Implementation of a Gaussian Mixture Variational Autoencoder (GMVAE).
GMVAE

This repository contains a TensorFlow implementation of an unsupervised Gaussian Mixture Variational Autoencoder (GMVAE) trained on the MNIST dataset, built on the TensorFlow Probability library.

Three models are currently implemented:

  • VAE is a standard implementation of the Variational Autoencoder, with no convolutional layers
  • VAE_GMP is an adaptation of VAE that uses a Gaussian Mixture prior in place of the standard Normal distribution
  • GMVAE is an attempt to replicate the work described in this blog and inspired by this paper
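The distinguishing feature of VAE_GMP and GMVAE is the prior over the latent code: a mixture of Gaussians rather than a single standard Normal, so the prior log-density is a log-sum-exp over components. A minimal NumPy sketch of that mixture-prior log-density (the component means, scales, and weights below are illustrative assumptions, not values taken from this repository):

```python
import numpy as np

def log_normal_diag(z, mean, scale):
    """Log-density of a diagonal Gaussian, summed over latent dimensions."""
    var = scale ** 2
    return -0.5 * np.sum(np.log(2.0 * np.pi * var) + (z - mean) ** 2 / var, axis=-1)

def log_gmm_prior(z, means, scales, weights):
    """log p(z) under a K-component Gaussian mixture prior.

    z       : (latent_dim,)    latent sample
    means   : (K, latent_dim)  component means
    scales  : (K, latent_dim)  component standard deviations
    weights : (K,)             mixture weights, summing to 1
    """
    # log p(z) = logsumexp_k [ log w_k + log N(z | mu_k, sigma_k) ]
    log_comp = np.array([log_normal_diag(z, m, s) for m, s in zip(means, scales)])
    a = np.log(weights) + log_comp
    a_max = a.max()
    return a_max + np.log(np.exp(a - a_max).sum())

# Toy example: 3 components in a 2-D latent space
rng = np.random.default_rng(0)
means = rng.normal(size=(3, 2))
scales = np.ones((3, 2))
weights = np.full(3, 1.0 / 3.0)
print(log_gmm_prior(np.zeros(2), means, scales, weights))
```

In the repository this quantity would be handled by TensorFlow Probability's mixture distributions rather than hand-rolled NumPy; the sketch only shows the term that replaces the standard-Normal prior in the ELBO.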

The directory layout is as follows:

  • bin: Example Bash scripts for running the aforementioned models
  • checkpoints: Directories for saving checkpoints of trained model states
  • scripts: TensorFlow scripts implementing the models, run via the main run_gmvae.py script, alongside other helpful modules (helpers.py and base.py)

Note: This is a work in progress, so contributions and feedback are welcome.

Dependencies
