
SequentialLR Scheduler #111

Merged 2 commits into main from feat/SequentialLR on Oct 18, 2024
Conversation

JSabadin (Contributor)

SequentialLR Support for Training

Overview

This PR adds support for the SequentialLR learning rate scheduler in the training configuration. The SequentialLR scheduler allows for chaining multiple learning rate schedulers, making it possible to implement a warm-up phase followed by a more complex learning rate schedule during training.

Changes

  • SequentialLR Integration: You can now use the SequentialLR scheduler in the training configuration to combine multiple schedulers, such as a warm-up phase (LinearLR) followed by CosineAnnealingLR.
  • Configuration Example:
    scheduler:
      name: SequentialLR
      params:
        schedulers:
          - name: LinearLR               # Warmup phase
            params:
              start_factor: 0.1
              total_iters: 10
          - name: CosineAnnealingLR      # After warmup, switch to Cosine Annealing
            params:
              T_max: 299
              eta_min: 0.00001
        milestones: [10]  # Switch to CosineAnnealingLR after 10 iterations
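
For orientation, here is a minimal sketch of roughly what this configuration resolves to in plain PyTorch; the model, optimizer, and base learning rate below are placeholders and not part of this PR:

    import torch
    from torch.optim.lr_scheduler import CosineAnnealingLR, LinearLR, SequentialLR

    # Placeholder model and optimizer; in practice these come from the rest of the config.
    model = torch.nn.Linear(10, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    warmup = LinearLR(optimizer, start_factor=0.1, total_iters=10)     # warm-up phase
    cosine = CosineAnnealingLR(optimizer, T_max=299, eta_min=0.00001)  # main schedule

    # Use `warmup` first, then switch to `cosine` once 10 scheduler steps have passed.
    scheduler = SequentialLR(optimizer, schedulers=[warmup, cosine], milestones=[10])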

@JSabadin requested a review from a team as a code owner October 10, 2024 10:24
@JSabadin requested review from kozlov721, klemen1999, tersekmatija and conorsim and removed request for a team October 10, 2024 10:24
@github-actions bot added the enhancement (New feature or request) and release (New version release) labels Oct 10, 2024

codecov bot commented Oct 10, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Please upload report for BASE (main@fb5f1aa). Learn more about missing BASE report.

✅ All tests successful. No failed tests found.

Additional details and impacted files
@@           Coverage Diff           @@
##             main     #111   +/-   ##
=======================================
  Coverage        ?   97.14%           
=======================================
  Files           ?      140           
  Lines           ?     6028           
  Branches        ?        0           
=======================================
  Hits            ?     5856           
  Misses          ?      172           
  Partials        ?        0           

☔ View full report in Codecov by Sentry.

@kozlov721 changed the title from "feat SequentialLR" to "SequentialLR Scheduler" Oct 11, 2024
@kozlov721 (Collaborator) left a comment

LGTM

@klemen1999 (Collaborator) left a comment

LGTM, let's add some tests for this - I suppose we need to test the configure_optimizers function as a whole by trying different optimizer and scheduler configurations.
CC: @kozlov721 on where the best place to put them would be. IMO under unit tests, but for them to run we need a LuxonisLightningModule instance I think, so I'm not sure if this then falls under integration tests?

@kozlov721 (Collaborator)

LGTM, let's add some tests for this - I suppose we need to test the configure_optimizers function as a whole by trying different optimizer and scheduler configurations. CC: @kozlov721 on where the best place to put them would be. IMO under unit tests, but for them to run we need a LuxonisLightningModule instance I think, so I'm not sure if this then falls under integration tests?

I don't think this can be easily unit-tested. We can add a test_scheduler to the integration tests that would just start a training run with a simple config and 2 epochs, with a change of scheduler after the first epoch.
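
Separate from exercising configure_optimizers end to end, the chaining behaviour itself can be checked against plain PyTorch. A minimal pytest-style sketch, assuming a throwaway SGD optimizer and a placeholder base LR (this is not the repository's integration-test harness):

    import math

    import torch
    from torch.optim.lr_scheduler import CosineAnnealingLR, LinearLR, SequentialLR


    def test_sequential_lr_warmup_then_cosine():
        param = torch.nn.Parameter(torch.zeros(1))
        optimizer = torch.optim.SGD([param], lr=0.01)
        scheduler = SequentialLR(
            optimizer,
            schedulers=[
                LinearLR(optimizer, start_factor=0.1, total_iters=10),
                CosineAnnealingLR(optimizer, T_max=299, eta_min=1e-5),
            ],
            milestones=[10],
        )

        lrs = [optimizer.param_groups[0]["lr"]]
        for _ in range(30):
            optimizer.step()  # step the optimizer first to avoid the ordering warning
            scheduler.step()
            lrs.append(optimizer.param_groups[0]["lr"])

        assert math.isclose(lrs[0], 0.001)                       # warm-up starts at 0.1 * base LR
        assert all(a <= b for a, b in zip(lrs[:10], lrs[1:11]))  # LR ramps up during warm-up
        assert all(a >= b for a, b in zip(lrs[10:], lrs[11:]))   # LR decays after the milestone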

@klemen1999 (Collaborator) commented Oct 11, 2024

We can add a test_scheduler to the integration tests that would just start a training run with a simple config and 2 epochs, with a change of scheduler after the first epoch.

Yes, this makes sense. But can we use something like a 10-image dataset (or as small as possible) for such training runs so we don't increase the test times? We should use a very small dataset for all tests where we don't actually care about performance and are just testing functionality.

@JSabadin (Contributor, Author)

Will create a test for this as mentioned above. We can also reduce the CIFAR dataset to just about 10 images here.
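
As an illustration of the dataset-size point, one generic way to cut CIFAR-10 down to a handful of images is torchvision plus a Subset; this is only a sketch, and the repository's own dataset tooling may do this differently:

    import torch
    from torch.utils.data import Subset
    from torchvision import datasets, transforms

    full = datasets.CIFAR10(root="data", train=True, download=True,
                            transform=transforms.ToTensor())
    # Keep only the first 10 images so functionality-only tests stay fast.
    tiny = Subset(full, indices=range(10))

    loader = torch.utils.data.DataLoader(tiny, batch_size=2, shuffle=True)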

@klemen1999 merged commit 9c0cb4c into main Oct 18, 2024
9 checks passed
@klemen1999 deleted the feat/SequentialLR branch October 18, 2024 13:33
Labels: enhancement (New feature or request), release (New version release)
Projects: None yet
3 participants