gluon.loss

Gluon provides pre-defined loss functions in the mxnet.gluon.loss module. Short usage sketches for several of these classes follow the reference list below.

Losses for training neural networks

Loss(weight, batch_axis, **kwargs)
    Base class for loss.

L2Loss([weight, batch_axis])
    Calculates the mean squared error between pred and label.

L1Loss([weight, batch_axis])
    Calculates the mean absolute error between pred and label.

SigmoidBinaryCrossEntropyLoss([…])
    The cross-entropy loss for binary classification.

SoftmaxCrossEntropyLoss([axis, …])
    Computes the softmax cross-entropy loss.

KLDivLoss([from_logits, axis, weight, …])
    The Kullback-Leibler divergence loss.

HuberLoss([rho, weight, batch_axis])
    Calculates smoothed L1 loss, which is equal to L1 loss if the absolute error exceeds rho and to L2 loss otherwise.

HingeLoss([margin, weight, batch_axis])
    Calculates the hinge loss function often used in SVMs:
    L = sum_i max(0, margin - pred_i * label_i)

SquaredHingeLoss([margin, weight, batch_axis])
    Calculates the soft-margin loss function used in SVMs:
    L = sum_i max(0, margin - pred_i * label_i)^2

LogisticLoss([weight, batch_axis, label_format])
    Calculates the logistic loss (for binary classification only):
    L = sum_i log(1 + exp(-pred_i * label_i))

TripletLoss([margin, weight, batch_axis])
    Calculates triplet loss given three input tensors and a positive margin.

CTCLoss([layout, label_layout, weight])
    Connectionist Temporal Classification loss.
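
The regression losses above (L2Loss, L1Loss, HuberLoss) all share the (pred, label) calling convention and return one loss value per example along the batch axis. A minimal sketch with made-up data:

    from mxnet import nd
    from mxnet.gluon import loss as gloss

    pred = nd.array([[1.0], [2.0], [3.0]])   # hypothetical predictions
    label = nd.array([[1.5], [2.0], [0.0]])  # hypothetical targets

    l2 = gloss.L2Loss()             # 0.5 * (pred - label)^2
    l1 = gloss.L1Loss()             # |pred - label|
    huber = gloss.HuberLoss(rho=1)  # quadratic below rho, linear above

    print(l2(pred, label))   # per-example losses, shape (3,)
    print(l1(pred, label))
    print(huber(pred, label))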
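Both classification losses expect raw, unnormalized scores by default; the softmax and sigmoid are applied internally. A sketch, again with made-up numbers:

    from mxnet import nd
    from mxnet.gluon import loss as gloss

    # Multi-class: pred holds logits of shape (batch, num_classes),
    # label holds class indices (sparse labels are the default).
    sce = gloss.SoftmaxCrossEntropyLoss()
    logits = nd.array([[2.0, 0.5, -1.0], [0.1, 0.2, 3.0]])
    classes = nd.array([0, 2])
    print(sce(logits, classes))

    # Binary: pred holds raw scores (from_sigmoid=False by default),
    # label holds 0/1 targets of the same shape.
    bce = gloss.SigmoidBinaryCrossEntropyLoss()
    scores = nd.array([[0.3], [-1.2]])
    targets = nd.array([[1.0], [0.0]])
    print(bce(scores, targets))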
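The margin-based losses (HingeLoss, SquaredHingeLoss, and LogisticLoss with its default 'signed' label format) expect labels in {-1, +1} rather than {0, 1}:

    from mxnet import nd
    from mxnet.gluon import loss as gloss

    hinge = gloss.HingeLoss(margin=1)
    logistic = gloss.LogisticLoss(label_format='signed')
    scores = nd.array([[0.8], [-0.3]])
    labels = nd.array([[1.0], [-1.0]])  # signed labels
    print(hinge(scores, labels))
    print(logistic(scores, labels))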
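TripletLoss departs from the (pred, label) convention and takes three tensors: an anchor, a positive example, and a negative example. A sketch with random embeddings (the batch size and embedding width here are arbitrary):

    from mxnet import nd
    from mxnet.gluon import loss as gloss

    triplet = gloss.TripletLoss(margin=1)
    anchor = nd.random.normal(shape=(4, 16))
    positive = nd.random.normal(shape=(4, 16))  # same class as anchor
    negative = nd.random.normal(shape=(4, 16))  # different class
    # L = max(|anchor - positive|^2 - |anchor - negative|^2 + margin, 0)
    print(triplet(anchor, positive, negative))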
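Finally, a sketch of CTCLoss for sequence labeling, assuming the 'NTC' prediction layout and that the last dimension of pred spans the alphabet including the blank symbol; check the class documentation for the exact blank-index convention:

    from mxnet import nd
    from mxnet.gluon import loss as gloss

    ctc = gloss.CTCLoss(layout='NTC', label_layout='NT')
    pred = nd.random.normal(shape=(2, 20, 5))  # (batch, time, alphabet)
    label = nd.array([[1, 2], [2, 3]])         # (batch, max_label_length)
    print(ctc(pred, label))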