
softmax_cross_entropy

mxnet.ndarray.softmax_cross_entropy(data=None, label=None, out=None, name=None, **kwargs)

Calculate the cross entropy of the softmax output and the one-hot label.

  • This operator computes the cross entropy in two steps:
    • Applies the softmax function on the input array.
    • Computes and returns the cross entropy loss between the softmax output and the labels.

  • The softmax function and cross entropy loss are given by (a plain NumPy sketch of both steps follows this list):

    • Softmax Function:
    \[\text{softmax}(x)_i = \frac{\exp(x_i)}{\sum_j \exp(x_j)}\]
    • Cross Entropy Function:
    \[\text{CE}(\text{label}, \text{output}) = - \sum_i \text{label}_i \log(\text{output}_i)\]
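
As a sketch only, the two steps can be written in plain NumPy; this illustrates the formulas above and is not the operator's actual kernel (which is implemented in C++):

import numpy as np

def softmax(x):
    # Shift by the row-wise max for numerical stability, then normalize.
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def softmax_cross_entropy(data, label):
    # Pick each row's softmax probability at its true class index and
    # sum -log(prob) over the batch, matching this operator's reduction.
    probs = softmax(data)
    return -np.log(probs[np.arange(len(label)), label]).sum()

x = np.array([[1., 2., 3.], [11., 7., 5.]])
label = np.array([2, 0])
print(softmax_cross_entropy(x, label))  # ~0.4281871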

Example:

x = [[1, 2, 3],
     [11, 7, 5]]

label = [2, 0]

softmax(x) = [[0.09003057, 0.24472848, 0.66524094],
              [0.97962922, 0.01794253, 0.00242826]]

softmax_cross_entropy(x, label) = - log(0.66524094) - log(0.97962922) = 0.4281871
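
The example can be reproduced by calling the operator directly. A minimal sketch, assuming an MXNet 1.x installation where the operator is exposed as mx.nd.softmax_cross_entropy:

import mxnet as mx

data = mx.nd.array([[1, 2, 3],
                    [11, 7, 5]])
label = mx.nd.array([2, 0])

# The operator returns a single-element NDArray holding the loss
# summed over the batch.
loss = mx.nd.softmax_cross_entropy(data, label)
print(loss.asscalar())  # ~0.4281871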

Defined in src/operator/loss_binary_op.cc:L59

Parameters:
  • data (NDArray) – Input data.
  • label (NDArray) – Input label.
  • out (NDArray, optional) – The output NDArray to hold the result.
Returns:
  • out – The output of this function.

Return type:
  • NDArray or list of NDArrays