Commit

Add LRSchedulerFactory docs
takuseno committed Oct 19, 2024
1 parent d324094 commit 7a17efe
Showing 2 changed files with 28 additions and 1 deletion.
2 changes: 1 addition & 1 deletion d3rlpy/models/lr_schedulers.py
@@ -38,7 +38,7 @@ class WarmupSchedulerFactory(LRSchedulerFactory):
     .. math::

-        lr = \max((t + 1) / warmup_steps, 1)
+        lr = \max((t + 1) / warmup\_steps, 1)
Args:
warmup_steps: Warmup steps.
27 changes: 27 additions & 0 deletions docs/references/optimizers.rst
@@ -38,3 +38,30 @@ There are also convenient aliases.
   d3rlpy.models.AdamFactory
   d3rlpy.models.RMSpropFactory
   d3rlpy.models.GPTAdamWFactory


Learning rate scheduler
~~~~~~~~~~~~~~~~~~~~~~~

d3rlpy provides ``LRSchedulerFactory``, which lets you configure learning rate
schedulers together with ``OptimizerFactory``.

.. code-block:: python

   import d3rlpy

   # set lr_scheduler_factory
   optim_factory = d3rlpy.models.AdamFactory(
       lr_scheduler_factory=d3rlpy.models.WarmupSchedulerFactory(
           warmup_steps=10000,
       ),
   )

.. autosummary::
   :toctree: generated/
   :nosignatures:

   d3rlpy.models.LRSchedulerFactory
   d3rlpy.models.WarmupSchedulerFactory
   d3rlpy.models.CosineAnnealingLRFactory
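For context, a configured ``OptimizerFactory`` is typically handed to an algorithm config rather than used directly. The sketch below assumes d3rlpy's v2 config API; ``d3rlpy.algos.DQNConfig`` and its ``optim_factory`` argument come from d3rlpy's general interface, not from this diff, so treat it as an illustrative sketch:

```python
import d3rlpy

# Adam factory whose learning rate warms up over the first
# 10,000 gradient steps, as in the new docs above.
optim_factory = d3rlpy.models.AdamFactory(
    lr_scheduler_factory=d3rlpy.models.WarmupSchedulerFactory(
        warmup_steps=10000,
    ),
)

# Assumption: any algorithm config exposing an optim_factory
# parameter (DQNConfig is used here as one example) accepts
# the factory; the scheduler is then created and stepped
# internally during training.
dqn = d3rlpy.algos.DQNConfig(optim_factory=optim_factory).create()
```

The point of the factory indirection is that the scheduler is serializable configuration: it is built only when the optimizer itself is created inside the algorithm.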
