MomentumScheduler(optimizer: torch.optim.optimizer.Optimizer, last_epoch: int = -1)¶
from_params(optimizer: torch.optim.optimizer.Optimizer, params: allennlp.common.params.Params)¶
This is the automatic implementation of from_params. Any class that subclasses FromParams (or Registrable, which itself subclasses FromParams) gets this implementation for free. If you want your class to be instantiated from params in the “obvious” way – pop off parameters and hand them to your constructor with the same names – this provides that functionality.
If you need more complex logic in your from_params method, you’ll have to implement your own method that overrides this one.
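The “obvious” behaviour described above can be sketched in plain Python. This is a hypothetical standalone version, not AllenNLP’s actual implementation, which additionally handles type annotations, nested objects, and Registrable lookup:

```python
import inspect

def from_params(cls, params: dict):
    """Sketch of automatic from_params: pop constructor arguments
    by name from `params` and hand them to `cls`, falling back to
    declared defaults for keys that are absent."""
    kwargs = {}
    for name, arg in inspect.signature(cls.__init__).parameters.items():
        if name == "self":
            continue
        if name in params:
            kwargs[name] = params.pop(name)
        elif arg.default is not inspect.Parameter.empty:
            kwargs[name] = arg.default
    return cls(**kwargs)
```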
get_values() → None¶
InvertedTriangular(optimizer: torch.optim.optimizer.Optimizer, cool_down: int, warm_up: int, ratio: int = 10, last_epoch: int = -1)¶
Adjust momentum during training according to an inverted triangle-like schedule.
The momentum starts off high, then decreases linearly for cool_down epochs until reaching 1/ratio of its original value. The momentum then increases linearly for warm_up epochs until reaching its original value again. If there are still more epochs left to train, the momentum stays flat at the original value.
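The schedule above can be sketched as a pure function of the epoch. The function name and the interpolation details are assumptions for illustration, not the scheduler’s exact internals:

```python
def inverted_triangular_momentum(epoch: int,
                                 base: float = 0.9,
                                 cool_down: int = 5,
                                 warm_up: int = 10,
                                 ratio: int = 10) -> float:
    """Momentum at `epoch` under an inverted-triangular schedule:
    linear descent from `base` to `base / ratio` over `cool_down`
    epochs, linear ascent back to `base` over `warm_up` epochs,
    then flat at `base`."""
    low = base / ratio
    if epoch <= cool_down:
        # Descending leg of the inverted triangle.
        return base - (base - low) * (epoch / cool_down)
    if epoch <= cool_down + warm_up:
        # Ascending leg back to the original value.
        return low + (base - low) * ((epoch - cool_down) / warm_up)
    # Remaining epochs: stay flat at the original momentum.
    return base
```

With the defaults, momentum falls from 0.9 to 0.09 over epochs 0–5, climbs back to 0.9 by epoch 15, and remains at 0.9 afterwards.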