Online Model Adaptation with Feedforward Compensation

This repository contains the code for online model adaptation with feedforward compensation, as presented in the following paper:

Abulikemu Abuduweili and Changliu Liu, "Online Model Adaptation with Feedforward Compensation," CoRL, 2023.

Spotlight Video: here

Abstract

To cope with distribution shifts or non-stationarity in system dynamics, online adaptation algorithms have been introduced to update offline-learned prediction models in real-time. Existing online adaptation methods focus on optimizing the prediction model by utilizing feedback from the latest prediction error. Unfortunately, this feedback-based approach is susceptible to forgetting past information. This work proposes an online adaptation method with feedforward compensation, which uses critical data samples from a memory buffer, instead of the latest samples, to optimize the prediction model. We prove that the proposed approach achieves a smaller error bound compared to previously utilized methods in slow time-varying systems. Furthermore, our feedforward adaptation technique is capable of estimating an uncertainty bound for predictions.
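As a rough, illustrative sketch of the idea (not the code in this repository), the snippet below adapts a PyTorch predictor online by taking a gradient step on the buffered samples with the largest current prediction error, used here as a simple stand-in for the paper's criterion for selecting critical samples; model, optimizer, and buffer are assumed placeholders.

import torch

def feedforward_adapt_step(model, optimizer, buffer, x_new, y_new,
                           buffer_size=1000, k=8):
    # Store the newest observation in the memory buffer.
    buffer.append((x_new, y_new))
    if len(buffer) > buffer_size:
        buffer.pop(0)

    # Score every buffered sample by its current prediction error.
    xs = torch.stack([x for x, _ in buffer])
    ys = torch.stack([y for _, y in buffer])
    with torch.no_grad():
        residuals = (model(xs) - ys).pow(2).reshape(len(buffer), -1).mean(dim=1)

    # Feedforward compensation: update on the most critical (largest-error)
    # buffered samples instead of only the latest sample.
    idx = torch.topk(residuals, k=min(k, len(buffer))).indices
    loss = (model(xs[idx]) - ys[idx]).pow(2).mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()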

About the Code

Install Requirements

pip install numpy pandas scikit-learn torch

Training the Model on etth1/ill/exchange:

python train.py --data etth1
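The other datasets listed above can presumably be trained by substituting their names for the --data argument, e.g.:

python train.py --data ill
python train.py --data exchange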

Adapting the trained model with feedforward adaptation:

python adap.py --data etth1 --adapt sgd --buffer_size 1000
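Here --adapt sgd presumably selects the adaptation optimizer and --buffer_size 1000 the length of the memory buffer used for feedforward compensation; the other datasets can likely be adapted the same way, e.g.:

python adap.py --data exchange --adapt sgd --buffer_size 1000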

Citation

If you find the code helpful in your research or work, please cite the following paper.

@inproceedings{abuduweili2023online,
  title     = {Online Model Adaptation with Feedforward Compensation},
  author    = {Abulikemu Abuduweili and Changliu Liu},
  booktitle = {7th Annual Conference on Robot Learning},
  year      = {2023},
}
