In this tutorial we are going to be looking at the `PlateauLRScheduler` in the `timm` library.

```python
from timm.scheduler.plateau_lr import PlateauLRScheduler
from nbdev.showdoc import show_doc
```
Decay the LR by a factor every time the validation loss plateaus.
The `PlateauLRScheduler` as shown above accepts an optimizer and also some hyperparams, which we will look into in detail below. We will first see how we can train models using the `PlateauLRScheduler` with the `timm` training script, and then look at how we can use this scheduler as a standalone scheduler in our custom training scripts.
To train models using the `PlateauLRScheduler`, we simply update the training script args by passing in the `--sched plateau` parameter alongside the necessary hyperparams. In this section we will also look at how each of the hyperparams updates the scheduler's behaviour.
The training command to use the `plateau` scheduler looks something like:
```
python train.py ../imagenette2-320/ --sched plateau
```
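The plateau hyperparams can be adjusted from the command line as well; a sketch (assuming the `--decay-rate` and `--patience-epochs` argument names from timm's `train.py`, shown with their default values):

```shell
python train.py ../imagenette2-320/ --sched plateau --decay-rate 0.1 --patience-epochs 10
```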
The `PlateauLRScheduler` by default tracks the `eval-metric`, which is `top-1` by default in the `timm` training script. If the performance plateaus for a certain number of epochs (by default 10), then the new learning rate is set to `lr * decay_rate`. Underneath, this scheduler uses PyTorch's `ReduceLROnPlateau`.
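The core behaviour can be sketched in a few lines of plain Python (a simplified illustration, not timm's actual implementation; it ignores warmup, cooldown, thresholds, and LR noise):

```python
def plateau_decay(val_losses, lr=0.1, decay_rate=0.1, patience=10):
    """Simplified plateau logic: decay lr once the loss has not
    improved for more than `patience` consecutive epochs."""
    best = float('inf')
    bad_epochs = 0
    lrs = []
    for loss in val_losses:
        if loss < best:
            best = loss
            bad_epochs = 0
        else:
            bad_epochs += 1
        if bad_epochs > patience:
            lr *= decay_rate  # new lr = lr * decay_rate
            bad_epochs = 0
        lrs.append(lr)
    return lrs

# loss improves twice, then stalls: lr drops once the plateau
# outlasts the patience window
print(plateau_decay([1.0, 0.8, 0.7] + [0.7] * 5, patience=3))
```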
All arguments passed to this scheduler are the same as PyTorch's
ReduceLROnPlateau except they are renamed as follows: