
LLMCL

Analyzing and Reducing Catastrophic Forgetting in Parameter Efficient Tuning

Overview

LLMCL is a repository built on the Hugging Face Transformers library, designed to assess the continual learning capability of large language models. With this repository, users can easily customize datasets, specify models, and experiment with classical continual learning methods.

Key Features

  • Continual Learning Methods: The repository includes several classical continual learning methods for users to reference and use.
  • Model Customization: You can easily customize the model you want to use, and the repository will automatically download the corresponding model.

Quick Start

1. Install dependencies

conda create -n llmcl python=3.10
conda activate llmcl
pip install -r requirements.txt

2. Start Training

./scripts/train_seq.sh

3. Inference

./scripts/infer_seq.sh

4. Customize

You can easily customize the scripts for your own use:

  • Ensure your dataset is organized in JSON format with prompt and answer as keys (a minimal sketch of the expected layout follows this list).
  • Save the dataset file to <DATA_PATH>/<DATASET_NAME>/<SPLIT>.json.
  • For more details, refer to the get_dataset.py file.
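The snippet below is a minimal sketch of how a custom dataset file might be written. The prompt and answer keys and the <DATA_PATH>/<DATASET_NAME>/<SPLIT>.json path come from the convention above; the JSON-list layout, the directory names, and the sample record are illustrative assumptions, so consult get_dataset.py for the authoritative loading logic.

import json
import os

# Illustrative example only: each split file is assumed to be a JSON list of
# records keyed by "prompt" and "answer". Verify against get_dataset.py.
sample = [
    {
        "prompt": "What is catastrophic forgetting?",
        "answer": "The loss of previously learned knowledge when a model is trained on new tasks.",
    }
]

data_path = "./data_files"    # <DATA_PATH>
dataset_name = "my_dataset"   # <DATASET_NAME> (hypothetical name)
split = "train"               # <SPLIT>

# Create <DATA_PATH>/<DATASET_NAME>/ and write the split file.
os.makedirs(os.path.join(data_path, dataset_name), exist_ok=True)
with open(os.path.join(data_path, dataset_name, f"{split}.json"), "w") as f:
    json.dump(sample, f, indent=2)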

Reproduce

To reproduce our results, you need to:

1. Request access to the Llama 2 model, and download the TRACE benchmark, MedMCQA, and JEC-QA datasets to the ./data_files folder.

2. Customize your training scripts under ./scripts and run them.

Citation

If you find this repository helpful, please consider citing our work:

@misc{ren2024analyzing,
      title={Analyzing and Reducing Catastrophic Forgetting in Parameter Efficient Tuning}, 
      author={Weijieying Ren and Xinlong Li and Lei Wang and Tianxiang Zhao and Wei Qin},
      year={2024},
      eprint={2402.18865},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}
