class mxnet.autograd.Function[source]

Customize differentiation in autograd.

If you don’t want to use the gradients computed by the default chain-rule, you can use Function to customize differentiation for computation. You define your computation in the forward method and provide the customized differentiation in the backward method. During gradient computation, autograd will use the user-defined backward function instead of the default chain-rule. You can also cast to numpy array and back for some operations in forward and backward.

For example, a stable sigmoid function can be defined as:

class sigmoid(mx.autograd.Function):
    def forward(self, x):
        y = 1 / (1 + mx.nd.exp(-x))
        self.save_for_backward(y)
        return y

    def backward(self, dy):
        # backward takes as many inputs as forward's return values,
        # and returns as many NDArrays as forward's arguments.
        y, = self.saved_tensors
        return dy * y * (1 - y)

Then, the function can be used in the following way:

func = sigmoid()
x = mx.nd.random.uniform(shape=(10,))
x.attach_grad()

with mx.autograd.record():
    m = func(x)
m.backward()
dx = x.grad.asnumpy()
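For readers without mxnet installed, the same forward/backward pattern can be sketched in plain NumPy. The class below is illustrative only (it mirrors, but is not, mx.autograd.Function): forward saves the intermediate result, and backward reuses it to compute the gradient, which we then verify against a finite-difference estimate.

```python
import numpy as np

class StableSigmoid:
    """Conceptual stand-in for a custom autograd Function."""

    def forward(self, x):
        y = 1.0 / (1.0 + np.exp(-x))
        self.saved = y          # analogous to self.save_for_backward(y)
        return y

    def backward(self, dy):
        y = self.saved
        # d/dx sigmoid(x) = y * (1 - y), reusing the saved forward output
        return dy * y * (1.0 - y)

func = StableSigmoid()
x = np.linspace(-3.0, 3.0, 7)
y = func.forward(x)
dx = func.backward(np.ones_like(x))  # gradient of sum(y) w.r.t. x

# Check the hand-written gradient against central finite differences
eps = 1e-6
fd = (1.0 / (1.0 + np.exp(-(x + eps)))
      - 1.0 / (1.0 + np.exp(-(x - eps)))) / (2 * eps)
print(np.allclose(dx, fd, atol=1e-6))
```

The finite-difference check is a useful habit whenever you override backward: a silently wrong custom gradient will still train, just badly.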

Methods:

__init__: Initialize self. See help(type(self)) for accurate signature.

backward: Backward computation.

forward: Forward computation.