
transformers-into-vaes

Code for "Finetuning Pretrained Transformers into Variational Autoencoders", presented at the Insights Workshop @ EMNLP 2021.

Gathering data used in the paper:

  1. Download all data (penn, snli, yahoo, yelp) from this repository.

  2. Change the data path in base_models.py accordingly.

Running experiments:

  1. Install dependencies:

     pip install -r requirements.txt

  2. Run phase 1 (encoder-only training):

     ./run_encoder_training snli

  3. Run phase 2 (full training):

     ./run_training snli <path_to_checkpoint_from_phase_1>

Calculating metrics:

python evaluate_all.py -d snli -bs 256 -c <path_to_config_file> -ckpt <path_to_checkpoint_file> 
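The two training phases and the evaluation command above can be chained in a small wrapper; a minimal sketch in POSIX sh, assuming the scripts sit in the repository root. The function only echoes each command (a dry run) so the sequence is visible; drop the echo prefixes to execute for real. The checkpoint and config paths are hypothetical placeholders supplied by the caller.

```shell
# pipeline: dry-run print of the full sequence from this README.
# Arguments (all caller-supplied placeholders):
#   $1 dataset (penn | snli | yahoo | yelp)
#   $2 phase-1 encoder checkpoint (output of run_encoder_training)
#   $3 config file for evaluation
#   $4 checkpoint to evaluate (output of run_training)
pipeline() {
    dataset="$1"; phase1_ckpt="$2"; config="$3"; eval_ckpt="$4"

    # Phase 1: encoder-only training.
    echo ./run_encoder_training "$dataset"

    # Phase 2: full training, initialized from the phase-1 checkpoint.
    echo ./run_training "$dataset" "$phase1_ckpt"

    # Metrics on the finished model.
    echo python evaluate_all.py -d "$dataset" -bs 256 \
        -c "$config" -ckpt "$eval_ckpt"
}

# Example invocation (prints the three commands without running them):
pipeline snli phase1.ckpt config.json final.ckpt
```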
