Layer normalization layer
This card wraps the Keras LayerNormalization class. Layer normalization normalizes the activations of each sample independently across the given axes, rather than across the batch.
Note: the backend for building and training neural networks is based on Keras. The documentation of this card is adapted from the documentation of its corresponding Keras class.
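For orientation, here is a minimal NumPy sketch of the computation this layer performs per sample. It is illustrative only: in the actual layer, gamma and beta are learned weights, and the Keras implementation is more general.

```python
import numpy as np

def layer_norm(x, gamma, beta, epsilon=1e-3, axis=-1):
    """Normalize x across `axis`, then apply the scale (gamma) and offset (beta)."""
    mean = x.mean(axis=axis, keepdims=True)
    var = x.var(axis=axis, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + epsilon)  # Epsilon guards against division by zero
    return gamma * x_hat + beta                  # gamma applies if Scale, beta if Center
```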
Inputs
- Axes — List of Integer
  The axis or axes to normalize across. This is typically the features axis or axes; the left-out axes are typically the batch axis or axes. Defaults to -1, the last dimension of the input.
- Epsilon — Float
  Small float added to the variance to avoid division by zero.
- Center — Boolean
  If "true", adds an offset, beta, to the normalized tensor. If "false", beta is ignored.
- Scale — Boolean
  If "true", multiplies the normalized tensor by gamma. If "false", gamma is not used. When the next layer is linear (this also applies to, e.g., ReLU), this can be disabled, since the scaling will be done by the next layer.
- Beta initializer — NeuralNetworkInitializer
  Initializer for the beta weight. If not specified, the Zeros initializer is used.
- Gamma initializer — NeuralNetworkInitializer
  Initializer for the gamma weight. If not specified, the Ones initializer is used.
- Beta regularizer — NeuralNetworkRegularizer
  Optional regularizer for the beta weight.
- Gamma regularizer — NeuralNetworkRegularizer
  Optional regularizer for the gamma weight.
- Beta constraint — NeuralNetworkConstraint
  Optional constraint for the beta weight.
- Gamma constraint — NeuralNetworkConstraint
  Optional constraint for the gamma weight.
- Input — NeuralNetworkTensor
  Input of this layer.
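The inputs above map directly onto the constructor arguments of keras.layers.LayerNormalization. A minimal sketch, with every argument set to its Keras default:

```python
from tensorflow import keras

layer = keras.layers.LayerNormalization(
    axis=-1,                    # Axes: normalize over the last dimension
    epsilon=1e-3,               # Epsilon
    center=True,                # Center: learn a beta offset
    scale=True,                 # Scale: learn a gamma multiplier
    beta_initializer="zeros",   # Beta initializer
    gamma_initializer="ones",   # Gamma initializer
    beta_regularizer=None,      # Beta regularizer (optional)
    gamma_regularizer=None,     # Gamma regularizer (optional)
    beta_constraint=None,       # Beta constraint (optional)
    gamma_constraint=None,      # Gamma constraint (optional)
)
```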
Outputs
- Layer instance — NeuralNetworkLayer
  Instance of this layer. It can be wrapped using a Bidirectional or a TimeDistributed wrapper.
- Output — NeuralNetworkTensor
  Output of this layer.
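A short usage sketch showing how the two outputs relate; the input shape here is an illustrative assumption (10 timesteps, 16 features):

```python
from tensorflow import keras

inputs = keras.Input(shape=(10, 16))         # Input tensor
layer = keras.layers.LayerNormalization()    # Layer instance
outputs = layer(inputs)                      # Output tensor; shape is unchanged

# The layer instance can also be passed to a wrapper, e.g. TimeDistributed,
# which applies the layer to every timestep slice independently:
wrapped = keras.layers.TimeDistributed(keras.layers.LayerNormalization())(inputs)

model = keras.Model(inputs, outputs)
```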