Untuned Warmup
- class pytorch_warmup.untuned.UntunedExponentialWarmup(optimizer, last_step=-1)
Untuned exponential warmup schedule for Adam.
This warmup scheme is described in *On the adequacy of untuned warmup for adaptive optimization*.
- Parameters:
optimizer (Optimizer) – an Adam optimizer
last_step (int) – The index of the last step. (Default: -1)
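The schedule is "untuned" because its warmup period is derived from Adam's β₂ rather than chosen by hand. A minimal sketch of the warmup factor following the paper's exponential scheme, with period τ = 1/(1 − β₂); the function name is illustrative, not part of the library API:

```python
import math

def untuned_exponential_factor(step: int, beta2: float = 0.999) -> float:
    """Warmup factor w(t) = 1 - exp(-t / tau), with tau = 1 / (1 - beta2).

    The effective learning rate at step t is lr * w(t), and w(t) -> 1
    as t grows, so warmup fades out automatically.
    """
    tau = 1.0 / (1.0 - beta2)  # e.g. 1000 steps for Adam's default beta2 = 0.999
    return 1.0 - math.exp(-step / tau)

# For beta2 = 0.999: w(0) = 0 and w(1000) = 1 - e^{-1} ≈ 0.632
```

In the library itself, the class is constructed from an Adam optimizer and applies this dampening to the optimizer's learning rates at each step, so no warmup period needs to be passed in.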
- class pytorch_warmup.untuned.UntunedLinearWarmup(optimizer, last_step=-1)
Untuned linear warmup schedule for Adam.
This warmup scheme is described in *On the adequacy of untuned warmup for adaptive optimization*.
- Parameters:
optimizer (Optimizer) – an Adam optimizer
last_step (int) – The index of the last step. (Default: -1)