Table Of Contents

PolyScheduler

class mxnet.lr_scheduler.PolyScheduler(max_update, base_lr=0.01, pwr=2)[source]

Reduce the learning rate according to a polynomial of the given power.

Calculate the new learning rate by:

    base_lr * (1 - nup/max_nup)^pwr    if nup < max_nup
    0                                  otherwise

where nup is the current number of updates and max_nup is max_update.
Parameters:
  • max_update (int) – maximum number of updates before the decay reaches 0.
  • base_lr (float) – base learning rate.
  • pwr (int) – power of the decay term as a function of the current number of updates.
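The decay formula above can be sketched in plain Python; `poly_decay` is a hypothetical helper written here for illustration, not part of the MXNet API:

```python
def poly_decay(nup, max_update, base_lr=0.01, pwr=2):
    """Polynomial learning-rate decay matching the formula above.

    Returns base_lr * (1 - nup/max_update) ** pwr while nup < max_update,
    and 0 once max_update is reached.
    """
    if nup >= max_update:
        return 0.0
    return base_lr * (1.0 - nup / max_update) ** pwr

# The rate starts at base_lr and decays smoothly to 0 at max_update.
print(poly_decay(0, 1000))     # 0.01
print(poly_decay(500, 1000))   # 0.0025 (halfway, squared decay)
print(poly_decay(1000, 1000))  # 0.0
```

In MXNet itself, the scheduler instance is called with the current update count, e.g. `mxnet.lr_scheduler.PolyScheduler(max_update=1000)(500)`, which should yield the same values as this sketch.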
__init__(max_update, base_lr=0.01, pwr=2)[source]

Initialize self. See help(type(self)) for accurate signature.

Methods

__init__(max_update[, base_lr, pwr]) Initialize self.