class mxnet.gluon.nn.InstanceNorm(axis=1, epsilon=1e-05, center=True, scale=False, beta_initializer='zeros', gamma_initializer='ones', in_channels=0, **kwargs)[source]

Applies instance normalization to the n-dimensional input array. This operator takes an n-dimensional input array (n > 2) and normalizes it using the following formula:

\[\bar{C} = \{i \mid i \neq 0, i \neq \text{axis}\}\]
\[out = \frac{x - \mathrm{mean}[data, \bar{C}]}{\sqrt{\mathrm{Var}[data, \bar{C}]} + \epsilon} * gamma + beta\]
  • axis (int, default 1) – The axis that will be excluded in the normalization process. This is typically the channels (C) axis. For instance, after a Conv2D layer with layout=’NCHW’, set axis=1 in InstanceNorm. If layout=’NHWC’, then set axis=3. Data is normalized over all axes except the first (batch) axis and the given axis.

  • epsilon (float, default 1e-5) – Small float added to variance to avoid dividing by zero.

  • center (bool, default True) – If True, add offset of beta to normalized tensor. If False, beta is ignored.

  • scale (bool, default False) – If True, multiply by gamma. If False, gamma is not used. When the next layer is linear (or an activation such as relu), this can be disabled, since the scaling will be done by the next layer.

  • beta_initializer (str or Initializer, default ‘zeros’) – Initializer for the beta weight.

  • gamma_initializer (str or Initializer, default ‘ones’) – Initializer for the gamma weight.

  • in_channels (int, default 0) – Number of channels (feature maps) in input data. If not specified, initialization will be deferred to the first time forward is called and in_channels will be inferred from the shape of input data.

Inputs:

  • data: input tensor with arbitrary shape.

Outputs:

  • out: output tensor with the same shape as data.
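The formula above can be checked directly with NumPy: the mean and variance are computed over every axis except axis 0 (batch) and the channel axis. A minimal sketch, using NumPy only (this is not the MXNet implementation; the function name is ours, and gamma and beta take their default values of 1 and 0):

```python
import numpy as np

def instance_norm(x, axis=1, eps=1e-5, gamma=1.0, beta=0.0):
    """Normalize over all axes except 0 (batch) and `axis` (channels)."""
    reduce_axes = tuple(i for i in range(x.ndim) if i not in (0, axis))
    mean = x.mean(axis=reduce_axes, keepdims=True)
    var = x.var(axis=reduce_axes, keepdims=True)
    # Note: epsilon is added to the standard deviation, matching the formula above.
    return (x - mean) / (np.sqrt(var) + eps) * gamma + beta

x = np.array([[[1.1, 2.2]],
              [[3.3, 4.4]]])
print(instance_norm(x))  # approximately [[[-1, 1]], [[-1, 1]]], matching the layer output below
```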


Reference: Ulyanov et al., “Instance Normalization: The Missing Ingredient for Fast Stylization”


>>> # Input of shape (2,1,2)
>>> x = mx.nd.array([[[ 1.1,  2.2]],
...                 [[ 3.3,  4.4]]])
>>> # Instance normalization is calculated with the above formula
>>> layer = InstanceNorm()
>>> layer.initialize(ctx=mx.cpu(0))
>>> layer(x)
[[[-0.99998355  0.99998331]]
 [[-0.99998319  0.99998361]]]
<NDArray 2x1x2 @cpu(0)>
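The axis argument only changes which dimension is treated as channels: the same data laid out as NHWC with axis=3 normalizes identically to NCHW with axis=1. A NumPy sketch of that equivalence (illustrative only; the helper below is ours, not the Gluon code path):

```python
import numpy as np

def instance_norm(x, axis, eps=1e-5):
    # Reduce over every axis except batch (0) and the channel axis.
    reduce_axes = tuple(i for i in range(x.ndim) if i not in (0, axis))
    mean = x.mean(axis=reduce_axes, keepdims=True)
    var = x.var(axis=reduce_axes, keepdims=True)
    return (x - mean) / (np.sqrt(var) + eps)

rng = np.random.default_rng(0)
nchw = rng.standard_normal((2, 3, 4, 4))   # N, C, H, W
nhwc = nchw.transpose(0, 2, 3, 1)          # same data laid out as N, H, W, C

out_nchw = instance_norm(nchw, axis=1)
out_nhwc = instance_norm(nhwc, axis=3)
# The two layouts agree after transposing back to NCHW.
print(np.allclose(out_nchw, out_nhwc.transpose(0, 3, 1, 2)))  # True
```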
__init__(axis=1, epsilon=1e-05, center=True, scale=False, beta_initializer='zeros', gamma_initializer='ones', in_channels=0, **kwargs)[source]

Initialize self. See help(type(self)) for accurate signature.


__init__([axis, epsilon, center, scale, …])

Initialize self.


apply(fn)

Applies fn recursively to every child block as well as self.


cast(dtype)

Cast this Block to use another data type.


collect_params([select])

Returns a ParameterDict containing the Parameters of this Block and all of its children (default), or a ParameterDict of only those Parameters whose names match the given regular expressions.

export(path[, epoch])

Export HybridBlock to json format that can be loaded by SymbolBlock.imports, mxnet.mod.Module or the C++ interface.

forward(x, *args)

Defines the forward computation.

hybrid_forward(F, x, gamma, beta)

Overrides to construct symbolic graph for this Block.


hybridize([active])

Activates or deactivates HybridBlocks recursively.


infer_shape(*args)

Infers shape of Parameters from inputs.


infer_type(*args)

Infers data type of Parameters from inputs.

initialize([init, ctx, verbose, force_reinit])

Initializes Parameters of this Block and its children.

load_parameters(filename[, ctx, …])

Load parameters from file previously saved by save_parameters.

load_params(filename[, ctx, allow_missing, …])

[Deprecated] Please use load_parameters.


name_scope()

Returns a name space object managing a child Block and parameter names.

register_child(block[, name])

Registers block as a child of self.


register_forward_hook(hook)

Registers a forward hook on the block.


register_forward_pre_hook(hook)

Registers a forward pre-hook on the block.


save_parameters(filename)

Save parameters to file.


save_params(filename)

[Deprecated] Please use save_parameters.


summary(*inputs)

Print the summary of the model’s output and parameters.



name

Name of this Block, without the trailing ‘_’.


params

Returns this Block’s parameter dictionary (does not include its children’s parameters).


prefix

Prefix of this Block.