PythonLossModule

class mxnet.module.PythonLossModule(name='pyloss', data_names=('data',), label_names=('softmax_label',), logger=<module 'logging'>, grad_func=None)[source]

A convenient module class that implements many of the module APIs as empty functions.

Parameters:
  • name (str) – Name of the module. The outputs will be named [name + ‘_output’].
  • data_names (list of str) – Defaults to ['data']. Names of the data expected by this module. Should be a list of only one name.
  • label_names (list of str) – Defaults to ['softmax_label']. Names of the labels expected by the module. Should be a list of only one name.
  • grad_func (function) – Optional. If not None, it should be a function that takes scores and labels, both of type NDArray, and returns the gradients with respect to the scores according to this loss function. The return value may be a numpy array or an NDArray (see the sketch below).
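
For illustration, a minimal sketch of constructing the module with a custom gradient function. The mean-squared-error gradient below is an assumption chosen for the example, not part of this API:

    import mxnet as mx
    from mxnet.module import PythonLossModule

    def mse_grad(scores, labels):
        # Assumed for illustration: gradient of a mean-squared-error loss
        # with respect to the scores, i.e. d/ds (s - y)^2 = 2 * (s - y).
        # Both arguments arrive as NDArrays; returning an NDArray (or a
        # numpy array) is accepted, per the parameter description above.
        return 2 * (scores - labels)

    # The module itself holds no parameters; it only records the scores on
    # forward() and applies grad_func on backward().
    loss_mod = PythonLossModule(name='mse_loss', grad_func=mse_grad)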
__init__(name='pyloss', data_names=('data',), label_names=('softmax_label',), logger=<module 'logging'>, grad_func=None)[source]

Initialize self. See help(type(self)) for accurate signature.

Methods

__init__([name, data_names, label_names, …]) Initialize self.
backward([out_grads]) Backward computation.
bind(data_shapes[, label_shapes, …]) Binds the symbols to construct executors.
fit(train_data[, eval_data, eval_metric, …]) Trains the module parameters.
forward(data_batch[, is_train]) Forward computation.
forward_backward(data_batch) A convenient function that calls both forward and backward.
get_input_grads([merge_multi_context]) Gets the gradients to the inputs, computed in the previous backward computation.
get_outputs([merge_multi_context]) Gets outputs of the previous forward computation.
get_params() Gets parameters, which are potentially copies of the actual parameters used to do computation on the device.
get_states([merge_multi_context]) Gets states from all devices.
init_optimizer([kvstore, optimizer, …]) Installs and initializes optimizers.
init_params([initializer, arg_params, …]) Initializes the parameters and auxiliary states.
install_monitor(mon) Installs monitor on all executors.
iter_predict(eval_data[, num_batch, reset, …]) Iterates over predictions.
load_params(fname) Loads model parameters from file.
predict(eval_data[, num_batch, …]) Runs prediction and collects the outputs.
prepare(data_batch[, sparse_row_id_fn]) Prepares the module for processing a data batch.
save_params(fname) Saves model parameters to file.
score(eval_data, eval_metric[, num_batch, …]) Runs prediction on eval_data and evaluates the performance according to the given eval_metric.
set_params(arg_params, aux_params[, …]) Assigns parameter and aux state values.
set_states([states, value]) Sets value for states.
update() Updates parameters according to the installed optimizer and the gradients computed in the previous forward-backward batch.
update_metric(eval_metric, labels[, pre_sliced]) Evaluates and accumulates evaluation metric on outputs of the last forward computation.
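
The typical life cycle mirrors other modules: bind with the expected shapes, run forward on a data batch, then backward to obtain the gradients of the loss with respect to its inputs (the scores). The sketch below is illustrative only; the mse_grad function, the shapes, and the random batch are assumptions, not part of this API:

    import mxnet as mx
    from mxnet.module import PythonLossModule

    def mse_grad(scores, labels):
        # Assumed gradient of a mean-squared-error loss w.r.t. the scores;
        # scores and labels are NDArrays of the same (assumed) shape.
        return 2 * (scores - labels)

    loss_mod = PythonLossModule(data_names=['data'],
                                label_names=['softmax_label'],
                                grad_func=mse_grad)

    # bind() records the expected shapes; no executor is created because
    # this module has no symbol or parameters of its own.
    loss_mod.bind(data_shapes=[('data', (4, 10))],
                  label_shapes=[('softmax_label', (4, 10))],
                  inputs_need_grad=True)

    batch = mx.io.DataBatch(data=[mx.nd.random.uniform(shape=(4, 10))],
                            label=[mx.nd.random.uniform(shape=(4, 10))])

    loss_mod.forward(batch, is_train=True)       # stores scores and labels
    print(loss_mod.get_outputs()[0].shape)       # scores pass through: (4, 10)

    loss_mod.backward()                          # calls grad_func(scores, labels)
    print(loss_mod.get_input_grads()[0].shape)   # gradient w.r.t. the scores: (4, 10)

In practice this module is usually stacked behind a network module (for example via mxnet.module.SequentialModule), so that get_input_grads() supplies the gradient that is propagated back into the network.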

Attributes

data_names A list of names for data required by this module.
data_shapes A list of (name, shape) pairs specifying the data inputs to this module.
label_shapes A list of (name, shape) pairs specifying the label inputs to this module.
output_names A list of names for the outputs of this module.
output_shapes A list of (name, shape) pairs specifying the outputs of this module.
symbol Gets the symbol associated with this module.
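
The attributes are populated by the constructor and by bind(). A small sketch with illustrative shapes; the printed values follow the defaults described above (outputs named [name + '_output'] with the default name 'pyloss'), and the exact output_shapes value is an assumption:

    import mxnet as mx
    from mxnet.module import PythonLossModule

    loss_mod = PythonLossModule(data_names=['data'], label_names=['softmax_label'])
    loss_mod.bind(data_shapes=[('data', (32, 10))],
                  label_shapes=[('softmax_label', (32,))])

    print(loss_mod.data_names)     # ['data']
    print(loss_mod.data_shapes)    # [('data', (32, 10))]
    print(loss_mod.label_shapes)   # [('softmax_label', (32,))]
    print(loss_mod.output_names)   # ['pyloss_output']
    print(loss_mod.output_shapes)  # [('pyloss_output', (32, 10))]  (assumed: mirrors the data shape)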