Table Of Contents

mxnet.lr_scheduler.MultiFactorScheduler

class mxnet.lr_scheduler.MultiFactorScheduler(step, factor=1, base_lr=0.01, warmup_steps=0, warmup_begin_lr=0, warmup_mode='linear')[source]

Reduce the learning rate according to a given list of steps.

Assume there exists k such that:

step[k] <= num_update < step[k+1]

Then calculate the new learning rate by:

base_lr * pow(factor, k+1)

Parameters
  • step (list of int) – The list of update steps at which to change the learning rate.

  • factor (float) – The factor by which to change the learning rate.

  • base_lr (float) – The initial learning rate.

  • warmup_steps (int) – Number of warmup steps used before this scheduler starts the decay.

  • warmup_begin_lr (float) – If using warmup, the learning rate from which warming up starts.

  • warmup_mode (string) – Warmup can be done in two modes: ‘linear’ mode gradually increases the lr in equal increments with each step; ‘constant’ mode keeps the lr at warmup_begin_lr for warmup_steps updates.
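The decay rule above can be sketched in plain Python. This is a minimal illustration of the schedule's logic, not the actual MXNet implementation, and warmup is ignored:

```python
def multi_factor_lr(num_update, step, factor=1.0, base_lr=0.01):
    """Learning rate after num_update updates: base_lr is multiplied
    by factor once for every milestone in step that has been passed."""
    k = 0
    # count the milestones already reached, i.e. step[k] <= num_update
    while k < len(step) and step[k] <= num_update:
        k += 1
    return base_lr * factor ** k

# Milestones at 100 and 200 updates, halving the rate each time:
# before update 100 the lr is base_lr; from 100 it is base_lr * 0.5;
# from 200 it is base_lr * 0.25.
```

This matches the formula base_lr * pow(factor, k+1), where k is the largest index satisfying step[k] <= num_update.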

__init__(step, factor=1, base_lr=0.01, warmup_steps=0, warmup_begin_lr=0, warmup_mode='linear')[source]

Initialize self. See help(type(self)) for accurate signature.

Methods

__init__(step[, factor, base_lr, …])

Initialize self.

get_warmup_lr(num_update)
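The warmup behavior described by warmup_mode can be sketched as follows. This assumes the standard linear-interpolation formula for warmup and is an illustration, not MXNet's source:

```python
def warmup_lr(num_update, warmup_steps, warmup_begin_lr, base_lr,
              warmup_mode="linear"):
    """Learning rate during the first warmup_steps updates (a sketch)."""
    if warmup_mode == "linear":
        # ramp linearly from warmup_begin_lr up to base_lr
        return warmup_begin_lr + (base_lr - warmup_begin_lr) * num_update / warmup_steps
    # 'constant' mode: hold warmup_begin_lr for the whole warmup
    return warmup_begin_lr
```

After warmup_steps updates have elapsed, the multi-factor decay schedule takes over.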