
[Feature] Distillation enhancement #463

Open · wants to merge 17 commits into main
Conversation

@HIT-cwh (Collaborator) commented Feb 23, 2023

Modification

  1. Support early stopping for distillation (a hook sketch follows this list).
  2. Add a norm connector and support configuring connectors as a list (see the config sketch below).
  3. Add the FGD loss (see the attention sketch below).
  4. Fix bugs in compute_distill_losses: if the recorders record no data, the distillation loss for the current iteration is zero (see the guard sketch below).
  5. Deepcopy the output/input in deliveries (see the sketch below).
  6. Enhance the KL divergence loss with optional element-wise weighting (see the sketch below).
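
Change 1 is implemented via a new hook (the coverage report below lists mmrazor/engine/hooks/distillation_loss_detach_hook.py). Here is a minimal sketch of how such a hook could look using MMEngine's Hook interface; the `stop_epoch` parameter and the `distillation_stopped` flag are illustrative assumptions, not the PR's actual API:

```python
from mmengine.hooks import Hook
from mmengine.model import is_model_wrapper


class DistillLossDetachHook(Hook):
    """Stop the distillation loss from contributing after a given epoch.

    `stop_epoch` and the `distillation_stopped` flag are illustrative
    assumptions, not necessarily the PR's actual API.
    """

    def __init__(self, stop_epoch: int):
        self.stop_epoch = stop_epoch

    def before_train_epoch(self, runner) -> None:
        if runner.epoch >= self.stop_epoch:
            model = runner.model
            if is_model_wrapper(model):
                model = model.module
            # The algorithm checks this flag and detaches (or skips)
            # the distillation loss when it is set.
            model.distillation_stopped = True
```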
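For change 2, a hypothetical distiller config fragment showing a connector entry given as a list: the connectors are built and applied in sequence, here a conv projection followed by a norm connector. The type names and their arguments are illustrative; check mmrazor's connector registry for the real ones:

```python
# Hypothetical config fragment; connector type names and arguments
# are illustrative, not necessarily mmrazor's registered names.
distiller = dict(
    type='ConfigurableDistiller',
    connectors=dict(
        # A list entry means the connectors are applied in order
        # to the recorded feature before the loss is computed.
        loss_s4_sfeat=[
            dict(type='ConvModuleConnector',
                 in_channel=256,
                 out_channel=512),
            dict(type='NormConnector', in_channels=512),
        ]),
)
```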
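Change 3 adds mmrazor/models/losses/fgd_loss.py. FGD (Focal and Global Knowledge Distillation for Detectors, Yang et al., CVPR 2022) weights the feature-imitation loss with spatial and channel attention maps; the sketch below shows those attention maps under that paper's definitions, with the full focal/global formulation left to fgd_loss.py:

```python
import torch.nn.functional as F


def fgd_attention_maps(feat, tau=0.5):
    """Spatial/channel attention in the spirit of FGD (Yang et al.,
    CVPR 2022): soft attention over absolute feature responses.

    A sketch only; the full focal/global loss is in mmrazor's
    fgd_loss.py and this helper name is hypothetical.
    """
    N, C, H, W = feat.shape
    # Spatial attention: softmax over the H*W locations of the
    # channel-averaged absolute activation, scaled by H*W.
    spatial = feat.abs().mean(dim=1).view(N, -1)        # (N, H*W)
    spatial_att = (H * W) * F.softmax(spatial / tau, dim=1)
    spatial_att = spatial_att.view(N, H, W)
    # Channel attention: softmax over the C channels of the
    # spatially averaged absolute activation, scaled by C.
    channel = feat.abs().mean(dim=(2, 3))               # (N, C)
    channel_att = C * F.softmax(channel / tau, dim=1)
    return spatial_att, channel_att
```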
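For change 4, a sketch of the guard described in the list, under an assumed recorder layout (a dict mapping each loss name to the list of tensors its recorder captured this iteration); mmrazor's real compute_distill_losses signature differs:

```python
import torch


def compute_distill_losses(recorded_data, loss_modules):
    """Return a zero loss for any recorder that captured nothing.

    `recorded_data` maps a loss name to the list of tensors recorded
    this iteration; this layout is an assumption for illustration.
    """
    losses = {}
    for name, loss_module in loss_modules.items():
        inputs = recorded_data.get(name, [])
        if not inputs:
            # Nothing was recorded this iteration, so the distillation
            # loss contributes zero instead of failing.
            losses[name] = torch.zeros(1)
        else:
            losses[name] = loss_module(*inputs)
    return losses
```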
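Change 5 guards deliveries against in-place mutation. A toy delivery, with illustrative names rather than mmrazor's actual delivery classes, showing where the deepcopy goes:

```python
import copy


class OutputDelivery:
    """Toy delivery: caches teacher outputs and replays them to the
    student. Class and attribute names are illustrative.
    """

    def __init__(self):
        self.data_queue = []

    def append(self, outputs):
        # Deepcopy on the way in: later in-place ops on `outputs`
        # cannot corrupt the cached copy.
        self.data_queue.append(copy.deepcopy(outputs))

    def pop(self):
        # Deepcopy on the way out for the same reason.
        return copy.deepcopy(self.data_queue.pop(0))
```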
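For change 6, a sketch of a temperature-scaled KL divergence with an optional per-sample weight. Argument names follow mmrazor's preds_S/preds_T convention, but the exact signature is an assumption:

```python
import torch.nn.functional as F


def kl_divergence(preds_S, preds_T, weight=None, tau=1.0):
    """Temperature-scaled KL divergence with optional per-sample
    weighting; a sketch, not mmrazor's exact API.
    """
    log_p_s = F.log_softmax(preds_S / tau, dim=1)
    p_t = F.softmax(preds_T.detach() / tau, dim=1)
    # Per-sample KL, rescaled by tau^2 as is standard for soft targets.
    kl = F.kl_div(log_p_s, p_t, reduction='none').sum(dim=1) * tau ** 2
    if weight is not None:
        # e.g. a mask or confidence score of shape (N,).
        kl = kl * weight
    return kl.mean()
```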

codecov bot commented Feb 23, 2023

Codecov Report

Base: 79.86% // Head: 79.21% // Decreases project coverage by 0.65% ⚠️

Coverage data is based on head (b479dc5) compared to base (a27952d).
Patch coverage: 85.59% of the modified lines in the pull request are covered.

Additional details and impacted files
@@             Coverage Diff             @@
##           dev-1.x     #463      +/-   ##
===========================================
- Coverage    79.86%   79.21%   -0.65%     
===========================================
  Files          251      278      +27     
  Lines        12756    13762    +1006     
  Branches      1963     2098     +135     
===========================================
+ Hits         10187    10902     +715     
- Misses        2159     2413     +254     
- Partials       410      447      +37     
| Flag | Coverage Δ |
|------|------------|
| unittests | 79.21% <85.59%> (-0.65%) ⬇️ |

Flags with carried forward coverage won't be shown.

| Impacted Files | Coverage Δ |
|----------------|------------|
| ...rithms/distill/configurable/fpn_teacher_distill.py | 38.46% <0.00%> (-1.54%) ⬇️ |
| mmrazor/models/losses/utils.py | 51.51% <51.51%> (ø) |
| ...hms/distill/configurable/single_teacher_distill.py | 76.81% <62.50%> (-1.32%) ⬇️ |
| ...mrazor/models/distillers/configurable_distiller.py | 89.71% <75.00%> (-3.76%) ⬇️ |
| mmrazor/models/algorithms/base.py | 93.75% <78.57%> (-6.25%) ⬇️ |
| ...azor/engine/hooks/distillation_loss_detach_hook.py | 87.50% <87.50%> (ø) |
| mmrazor/models/losses/fgd_loss.py | 97.41% <97.41%> (ø) |
| mmrazor/engine/__init__.py | 100.00% <100.00%> (ø) |
| mmrazor/engine/hooks/__init__.py | 100.00% <100.00%> (ø) |
| ...mrazor/models/architectures/connectors/__init__.py | 100.00% <100.00%> (ø) |

... and 38 more

☔ View full report at Codecov.
