bigscience-workshop/multilingual-modeling-1

 
 

README

Installation

  1. Install the packages from composable-sft.
  2. Install the packages from rational_activations, following its [Other CUDA/PyTorch] section for installation.
  3. Uninstall the stock transformers library: pip uninstall transformers.
  4. Install this repo's dependencies: pip install -r requirements.txt.
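The four steps above can be sketched as the following shell sequence. The checkout paths are placeholders, not actual locations; follow each project's own README for steps 1 and 2 (rational_activations in particular has CUDA/PyTorch-specific install instructions).

```shell
# Step 1: install composable-sft from its repo checkout (placeholder path).
pip install -e /path/to/composable-sft
# Step 2: install rational_activations per its [Other CUDA/PyTorch] section (placeholder path).
pip install -e /path/to/rational_activations
# Step 3: remove the stock transformers so this repo's pinned version is used.
pip uninstall -y transformers
# Step 4: install this repo's dependencies.
pip install -r requirements.txt
```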

Language Adaptations

See the scripts in lang_adapt/scripts/run_clm_* for examples of the following language adaptation strategies:

  • Adaptable adapters
  • BitFit
  • Continual Pretraining
  • IA3
  • LoRA
  • MAD-X
  • Pfeiffer
  • Pretraining from Scratch
  • Composable SFT
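To illustrate how these parameter-efficient strategies restrict what gets trained, here is a minimal sketch of BitFit, which fine-tunes only the bias terms of a pretrained model. This is an illustrative sketch assuming PyTorch, not the repo's implementation; the tiny model below stands in for the pretrained language model.

```python
# BitFit sketch: freeze all parameters except bias terms, then fine-tune
# only those. The Sequential model is a toy stand-in for a pretrained LM.
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 16))

# Mark only bias parameters as trainable; everything else stays frozen.
for name, param in model.named_parameters():
    param.requires_grad = name.endswith("bias")

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable parameters: {trainable} / {total}")
```

An optimizer would then be built over `(p for p in model.parameters() if p.requires_grad)`, so gradient updates touch only the small bias subset.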

WikiANN

See the scripts in scripts/eval/scripts_wikiann/pilot_*_.sh as examples for evaluating the adapted models on WikiANN.
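WikiANN is a named-entity recognition benchmark scored with span-level entity F1. The helpers below are an illustrative sketch of that metric (not the repo's evaluation code): BIO tag sequences are converted to (start, end, type) spans, and F1 is computed over exact span matches.

```python
def extract_spans(tags):
    """Convert a BIO tag sequence into a set of (start, end, type) spans."""
    spans, start, etype = set(), None, None
    for i, tag in enumerate(list(tags) + ["O"]):  # sentinel flushes the last span
        boundary = tag == "O" or tag.startswith("B-") or (
            tag.startswith("I-") and etype != tag[2:])
        if boundary and start is not None:
            spans.add((start, i, etype))
            start, etype = None, None
        # Start a new span on B-, or on a dangling I- with no open span
        # (label-projected data such as WikiANN can contain the latter).
        if tag.startswith("B-") or (tag.startswith("I-") and start is None):
            start, etype = i, tag[2:]
    return spans

def span_f1(gold_tags, pred_tags):
    """Entity-level F1: a prediction counts only if the full span and type match."""
    gold, pred = extract_spans(gold_tags), extract_spans(pred_tags)
    tp = len(gold & pred)
    prec = tp / len(pred) if pred else 0.0
    rec = tp / len(gold) if gold else 0.0
    return 2 * prec * rec / (prec + rec) if prec + rec else 0.0
```

For example, with gold tags `B-PER I-PER O B-LOC` and predicted tags `B-PER I-PER O O`, the PER span is matched but the LOC span is missed, giving precision 1.0, recall 0.5, and F1 of 2/3.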
