Layer normalization layer
This card is a wrapper of the Keras LayerNormalization class.

Note: the backend for building and training neural networks is based on Keras, so the documentation of this card is a variant of the documentation of its corresponding Keras class.
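As an illustration, and assuming the wrapped class is tf.keras.layers.LayerNormalization, the inputs listed below map onto the Keras constructor arguments roughly as follows:

```python
# A minimal sketch, assuming the card wraps tf.keras.layers.LayerNormalization.
import tensorflow as tf

layer = tf.keras.layers.LayerNormalization(
    axis=-1,                   # Axes: normalize across the last (features) dimension
    epsilon=1e-3,              # Epsilon: small float added to the variance
    center=True,               # Center: add the beta offset
    scale=True,                # Scale: multiply by gamma
    beta_initializer="zeros",  # Beta initializer (Zeros by default)
    gamma_initializer="ones",  # Gamma initializer (Ones by default)
    beta_regularizer=None,     # Beta regularizer (optional)
    gamma_regularizer=None,    # Gamma regularizer (optional)
    beta_constraint=None,      # Beta constraint (optional)
    gamma_constraint=None,     # Gamma constraint (optional)
)
```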
Inputs
- Axes — List of Integer
  The axis or axes to normalize across. Typically this is the features axis or axes; the left-out axes are typically the batch axis or axes. This argument defaults to -1, the last dimension in the input.
- Epsilon — Float
  Small float added to the variance to avoid dividing by zero.
- Center — Boolean
  If true, add an offset of beta to the normalized tensor. If false, beta is ignored.
- Scale — Boolean
  If true, multiply by gamma. If false, gamma is not used. When the next layer is linear (also e.g. ReLU), this can be disabled, since the scaling will be done by the next layer.
- Beta initializer — NeuralNetworkInitializer
  Initializer for the beta weight. If not specified, the Zeros initializer is used.
- Gamma initializer — NeuralNetworkInitializer
  Initializer for the gamma weight. If not specified, the Ones initializer is used.
- Beta regularizer — NeuralNetworkRegularizer
  Optional regularizer for the beta weight.
- Gamma regularizer — NeuralNetworkRegularizer
  Optional regularizer for the gamma weight.
- Beta constraint — NeuralNetworkConstraint
  Optional constraint for the beta weight.
- Gamma constraint — NeuralNetworkConstraint
  Optional constraint for the gamma weight.
- Input — NeuralNetworkTensor
  Input of this layer.
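For intuition about how these inputs interact, here is a hedged NumPy sketch of the computation layer normalization performs (the helper name layer_norm is illustrative, not part of the card):

```python
import numpy as np

def layer_norm(x, axis=-1, epsilon=1e-3, beta=0.0, gamma=1.0):
    # Hypothetical helper: standardize each sample across the given axis,
    # then apply the learned scale (gamma) and offset (beta).
    mean = x.mean(axis=axis, keepdims=True)
    var = x.var(axis=axis, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + epsilon)  # Epsilon avoids dividing by zero
    return gamma * x_hat + beta                  # Scale (gamma) and Center (beta)

x = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
print(layer_norm(x))  # each row now has approximately zero mean and unit variance
```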
Outputs
- Layer instance — NeuralNetworkLayer
  Instance of this layer. It can be wrapped using a Bidirectional or a TimeDistributed wrapper.
- Output — NeuralNetworkTensor
  Output of this layer.
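To illustrate how the two outputs relate, a short hedged Keras sketch (the input shape and variable names are illustrative):

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(10, 64))       # Input (NeuralNetworkTensor)
layer = tf.keras.layers.LayerNormalization()  # Layer instance (NeuralNetworkLayer)
outputs = layer(inputs)                       # Output (NeuralNetworkTensor)
model = tf.keras.Model(inputs, outputs)

# The layer instance can also be wrapped, e.g. with TimeDistributed:
wrapped = tf.keras.layers.TimeDistributed(
    tf.keras.layers.LayerNormalization()
)
```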