Variational autoencoders as mixins.
This repo contains a PyTorch implementation of the variational autoencoder (VAE) and several of its variants as mixin classes, which can be reused and composed in your own customized modules.
Check the docs here.
An example using a simple encoder and decoder on the MNIST dataset is in `example.py`.
A mixin is a term from object-oriented programming: a class that bundles methods meant to be combined into other classes through inheritance, rather than instantiated on its own.
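To make that concrete, here is a minimal sketch of the pattern; `GaussianVAEMixin`, `vae_loss`, and the method names are illustrative assumptions, not this repo's actual API (see the docs and `example.py` for the real interface).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GaussianVAEMixin:
    """Hypothetical mixin: adds reparameterization and a VAE loss to any
    module that defines `encode` (returning mu, logvar) and `decode`."""

    def reparameterize(self, mu, logvar):
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def vae_loss(self, x):
        mu, logvar = self.encode(x)
        recon = self.decode(self.reparameterize(mu, logvar))
        # Reconstruction term: summed over features, averaged over the batch.
        recon_loss = F.mse_loss(recon, x, reduction="none").sum(dim=1).mean()
        # KL term: summed over latent dimensions, averaged over the batch.
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=1).mean()
        return recon_loss + kl

class MyVAE(GaussianVAEMixin, nn.Module):
    """Your module supplies the encoder/decoder; the mixin supplies the rest."""

    def __init__(self, in_dim=784, latent_dim=16):
        super().__init__()
        self.enc = nn.Linear(in_dim, 2 * latent_dim)
        self.dec = nn.Linear(latent_dim, in_dim)

    def encode(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=1)
        return mu, logvar

    def decode(self, z):
        return self.dec(z)

loss = MyVAE().vae_loss(torch.randn(8, 784))
loss.backward()
```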
Implemented VAEs:
- VAE
- $\beta$-VAE (objective sketched after this list)
- InfoVAE
- DIP-VAE
- $\beta$-TCVAE
- VQ-VAE
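For orientation (not specific to this repo's API), the variants above differ mainly in how they regularize the latent code; for example, the $\beta$-VAE objective scales the KL term of the standard VAE:

$$\mathcal{L}_{\beta\text{-VAE}}(x) = \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big] - \beta \, D_{\mathrm{KL}}\big(q_\phi(z \mid x) \,\|\, p(z)\big)$$

where $\beta = 1$ recovers the standard VAE and $\beta > 1$ encourages a more disentangled latent representation.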
Losses are summed over the dimensions of each latent vector and then averaged across the samples in a minibatch.
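A minimal sketch of that reduction convention in PyTorch (the tensor names and shapes are illustrative; the repo's internals may differ):

```python
import torch

# Element-wise loss terms for a minibatch: shape (batch_size, latent_dim).
elementwise = torch.randn(32, 16).pow(2)
per_sample = elementwise.sum(dim=1)   # summed along each latent vector
loss = per_sample.mean(dim=0)         # averaged across samples in the minibatch
```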