mx.symbol.Activation

Description

Applies an activation function element-wise to the input.

The following activation functions are supported:

  • relu: Rectified Linear Unit, \(y = \max(x, 0)\)

  • sigmoid: \(y = \frac{1}{1 + \exp(-x)}\)

  • tanh: Hyperbolic tangent, \(y = \frac{\exp(x) - \exp(-x)}{\exp(x) + \exp(-x)}\)

  • softrelu: Soft ReLU, or SoftPlus, \(y = \log(1 + \exp(x))\)

  • softsign: \(y = \frac{x}{1 + |x|}\)
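The formulas above can be sketched directly in NumPy. This is an illustrative re-implementation only, not MXNet's own kernels (which operate on NDArrays/Symbols); the function names here are our own:

```python
import numpy as np

def relu(x):
    # y = max(x, 0), applied element-wise
    return np.maximum(x, 0)

def sigmoid(x):
    # y = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # y = (exp(x) - exp(-x)) / (exp(x) + exp(-x))
    return np.tanh(x)

def softrelu(x):
    # y = log(1 + exp(x)); log1p improves accuracy for small exp(x)
    return np.log1p(np.exp(x))

def softsign(x):
    # y = x / (1 + |x|)
    return x / (1.0 + np.abs(x))

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))      # [0. 0. 2.]
print(softsign(x))  # [-0.66666667  0.          0.66666667]
```

Note that all five functions are element-wise, so the output array always has the same shape as the input.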

Usage

mx.symbol.Activation(...)

Arguments

data

NDArray-or-Symbol. The input array.

act.type

{'relu', 'sigmoid', 'softrelu', 'softsign', 'tanh'}, required. Activation function to be applied.

name

string, optional. Name of the resulting symbol.