

mxnet.ndarray.Activation(data=None, act_type=_Null, out=None, name=None, **kwargs)

Applies an activation function element-wise to the input.

The following activation functions are supported:

  • relu: Rectified Linear Unit, \(y = \max(x, 0)\)

  • sigmoid: \(y = \frac{1}{1 + \exp(-x)}\)

  • tanh: Hyperbolic tangent, \(y = \frac{\exp(x) - \exp(-x)}{\exp(x) + \exp(-x)}\)

  • softrelu: Soft ReLU, or SoftPlus, \(y = \log(1 + \exp(x))\)

  • softsign: \(y = \frac{x}{1 + |x|}\)
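The formulas above can be checked with a plain-Python sketch. These are scalar reference implementations written for illustration, not the actual MXNet kernels, which operate element-wise on whole NDArrays:

```python
import math

# Scalar reference versions of the five supported activations.
def relu(x):
    return max(x, 0.0)            # y = max(x, 0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

def softrelu(x):
    return math.log(1.0 + math.exp(x))   # SoftPlus

def softsign(x):
    return x / (1.0 + abs(x))

print(relu(-2.0))    # 0.0
print(sigmoid(0.0))  # 0.5
```

In MXNet itself the same functions are selected by the `act_type` argument, e.g. `mx.nd.Activation(data, act_type='relu')`, and applied element-wise to the input array.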

Defined in src/operator/nn/

Parameters

  • data (NDArray) – The input array.

  • act_type ({'relu', 'sigmoid', 'softrelu', 'softsign', 'tanh'}, required) – Activation function to be applied.

  • out (NDArray, optional) – The output NDArray to hold the result.


Returns

out – The output of this function.

Return type

NDArray or list of NDArrays