Fix EMA checkpoint loading #156
Conversation
Codecov Report

✅ All tests successful. No failed tests found.

Additional details and impacted files:

@@            Coverage Diff             @@
##             main     #156      +/-   ##
==========================================
- Coverage   96.31%   95.76%   -0.56%
==========================================
  Files         147      170      +23
  Lines        6304     7552    +1248
==========================================
+ Hits         6072     7232    +1160
- Misses        232      320      +88

☔ View full report in Codecov by Sentry.
LGTM
We already support loading hyperparameters such as `lr_schedulers`, `epoch`, `global_step`, `optimizer_steps`, and `state_dict` (including EMA weights if used) via the `--resume` flag in the `train` command. This PR:
- Loads the EMA weights in the `on_load_checkpoint` hook, which is called before the `on_fit_start` hook (see the sketch below).
- Uses `trainer.resume_training` to resume training from checkpoints specified in the config file (`model.weights`).
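For illustration, here is a minimal sketch of how EMA weights can be restored in the `on_load_checkpoint` hook, assuming a PyTorch Lightning-style `LightningModule`. The `ema_model` attribute and the `"ema_state_dict"` checkpoint key are hypothetical names chosen for this example; they are not taken from this repository.

```python
# Minimal sketch, assuming a PyTorch Lightning-style LightningModule.
# The `ema_model` attribute and the "ema_state_dict" checkpoint key
# are hypothetical names used for illustration only.
from typing import Any, Dict

import torch
import lightning.pytorch as pl


class LitModel(pl.LightningModule):
    def __init__(self) -> None:
        super().__init__()
        self.model = torch.nn.Linear(4, 2)
        # Hypothetical EMA copy of the weights, updated during training.
        self.ema_model = torch.nn.Linear(4, 2)

    def on_save_checkpoint(self, checkpoint: Dict[str, Any]) -> None:
        # Persist the EMA weights alongside the regular state_dict so
        # that resuming from the checkpoint can restore them.
        checkpoint["ema_state_dict"] = self.ema_model.state_dict()

    def on_load_checkpoint(self, checkpoint: Dict[str, Any]) -> None:
        # This hook runs before `on_fit_start`, so the EMA weights are
        # already in place when training resumes.
        if "ema_state_dict" in checkpoint:
            self.ema_model.load_state_dict(checkpoint["ema_state_dict"])
```

Keeping the restore logic in `on_load_checkpoint` (rather than `on_fit_start`) guarantees the EMA state is loaded as part of checkpoint restoration itself, before any fit-time setup touches the model.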