Table Of Contents

rnn and contrib.rnn

Built-in recurrent neural network layers are provided by the following two modules:

mxnet.gluon.rnn

Recurrent neural network module.

mxnet.gluon.contrib.rnn

Contrib recurrent neural network module.

Recurrent Cells

rnn.LSTMCell

Long Short-Term Memory (LSTM) network cell.

rnn.GRUCell

Gated Recurrent Unit (GRU) network cell.

rnn.RecurrentCell

Abstract base class for RNN cells.

rnn.SequentialRNNCell

Sequentially stacking multiple RNN cells.

rnn.BidirectionalCell

Bidirectional RNN cell.

rnn.DropoutCell

Applies dropout on the input.

rnn.ZoneoutCell

Applies Zoneout on a base cell.

rnn.ResidualCell

Adds a residual connection as described in Wu et al., 2016 (https://arxiv.org/abs/1609.08144).

contrib.rnn.Conv1DRNNCell

1D Convolutional RNN cell.

contrib.rnn.Conv2DRNNCell

2D Convolutional RNN cell.

contrib.rnn.Conv3DRNNCell

3D Convolutional RNN cell.

contrib.rnn.Conv1DLSTMCell

1D Convolutional LSTM network cell.

contrib.rnn.Conv2DLSTMCell

2D Convolutional LSTM network cell.

contrib.rnn.Conv3DLSTMCell

3D Convolutional LSTM network cell.

contrib.rnn.Conv1DGRUCell

1D Convolutional Gated Recurrent Unit (GRU) network cell.

contrib.rnn.Conv2DGRUCell

2D Convolutional Gated Recurrent Unit (GRU) network cell.

contrib.rnn.Conv3DGRUCell

3D Convolutional Gated Recurrent Unit (GRU) network cell.

contrib.rnn.VariationalDropoutCell

Applies Variational Dropout on a base cell.

contrib.rnn.LSTMPCell

Long Short-Term Memory Projected (LSTMP) network cell.

Recurrent Layers

rnn.RNN

Applies a multi-layer Elman RNN with tanh or ReLU non-linearity to an input sequence.

rnn.LSTM

Applies a multi-layer long short-term memory (LSTM) RNN to an input sequence.

rnn.GRU

Applies a multi-layer gated recurrent unit (GRU) RNN to an input sequence.