---
title: Energy-Based Models for Continual Learning
abstract: We motivate Energy-Based Models (EBMs) as a promising model class for continual learning problems. Instead of tackling continual learning via the use of external memory, growing models, or regularization, EBMs change the underlying training objective to cause less interference with previously learned information. Our proposed version of EBMs for continual learning is simple, efficient, and outperforms baseline methods by a large margin on several benchmarks. Moreover, our proposed contrastive divergence-based training objective can be combined with other continual learning methods, resulting in substantial boosts in their performance. We further show that EBMs are adaptable to a more general continual learning setting where the data distribution changes without the notion of explicitly delineated tasks. These observations point towards EBMs as a useful building block for future continual learning methods.
video:
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: li22a
month: 0
tex_title: Energy-Based Models for Continual Learning
firstpage: 1
lastpage: 22
page: 1-22
order: 1
cycles: false
bibtex_author: Li, Shuang and Du, Yilun and van de Ven, Gido and Mordatch, Igor
author:
- given: Shuang
  family: Li
- given: Yilun
  family: Du
- given: Gido
  family: Ven
  prefix: van de
- given: Igor
  family: Mordatch
date: 2022-11-28
address:
container-title: Proceedings of The 1st Conference on Lifelong Learning Agents
volume: 199
genre: inproceedings
issued:
  date-parts:
  - 2022
  - 11
  - 28
pdf:
extras:
---