Batch normalization layer
This card is a wrapper of the Keras BatchNormalization class.
Note: the backend for building and training neural networks is based on Keras. The documentation of this card is a variant of the documentation of its corresponding Keras class.
Inputs
- Axis — Integer
  The axis that should be normalized (typically the features axis). For instance, after a Convolution 2D layer with Data format = “Channels first”, set Axis = “1” (see the sketch after this list).
- Momentum — Float
  Momentum for the moving average.
- Epsilon — Float
  Small float added to variance to avoid dividing by zero.
- Center — Boolean
  If “true”, add the beta offset to the normalized tensor. If “false”, beta is ignored.
- Scale — Boolean
  If “true”, multiply by gamma. If “false”, gamma is not used. When the next layer is linear (also e.g. ReLU), this can be disabled, since the scaling will be done by the next layer.
- Beta initializer — NeuralNetworkInitializer
  Initializer for the beta weight. If not specified, the Zeros initializer is used.
- Gamma initializer — NeuralNetworkInitializer
  Initializer for the gamma weight. If not specified, the Ones initializer is used.
- Moving mean initializer — NeuralNetworkInitializer
  Initializer for the moving mean. If not specified, the Zeros initializer is used.
- Moving variance initializer — NeuralNetworkInitializer
  Initializer for the moving variance. If not specified, the Ones initializer is used.
- Beta regularizer — NeuralNetworkRegularizer
  Optional regularizer for the beta weight.
- Gamma regularizer — NeuralNetworkRegularizer
  Optional regularizer for the gamma weight.
- Beta constraint — NeuralNetworkConstraint
  Optional constraint for the beta weight.
- Gamma constraint — NeuralNetworkConstraint
  Optional constraint for the gamma weight.
- Input — NeuralNetworkTensor
  Input of this layer.
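For orientation, here is a minimal sketch, assuming the standard tf.keras API, of how these card inputs map onto the wrapped keras.layers.BatchNormalization call. The concrete values (including Axis = “1” after a channels-first Convolution 2D layer) are illustrative, not defaults prescribed by the card:

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# After a Convolution 2D layer with Data format = "Channels first",
# the features axis is 1, so Axis = "1" on this card maps to axis=1.
inputs = keras.Input(shape=(3, 32, 32))
x = layers.Conv2D(16, 3, data_format="channels_first")(inputs)
x = layers.BatchNormalization(
    axis=1,                              # Axis
    momentum=0.99,                       # Momentum for the moving averages
    epsilon=1e-3,                        # Epsilon, added to the variance
    center=True,                         # Center: learn the beta offset
    scale=True,                          # Scale: learn the gamma factor
    beta_initializer="zeros",            # Beta initializer (default: Zeros)
    gamma_initializer="ones",            # Gamma initializer (default: Ones)
    moving_mean_initializer="zeros",     # Moving mean initializer (default: Zeros)
    moving_variance_initializer="ones",  # Moving variance initializer (default: Ones)
    beta_regularizer=None,               # Beta regularizer (optional)
    gamma_regularizer=None,              # Gamma regularizer (optional)
    beta_constraint=None,                # Beta constraint (optional)
    gamma_constraint=None,               # Gamma constraint (optional)
)(x)
# At inference time the layer computes, per feature along the chosen axis:
#   output = gamma * (input - moving_mean) / sqrt(moving_variance + epsilon) + beta
```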
Outputs
- Layer instance — NeuralNetworkLayer
  Instance of this layer. It can be wrapped using a Bidirectional or a TimeDistributed wrapper (see the sketch after this list).
- Output — NeuralNetworkTensor
  Output of this layer.
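As a usage note, the Layer instance output corresponds to the layer object itself, which can be handed to a wrapper. A small sketch, assuming the standard tf.keras wrappers (Bidirectional applies only to recurrent layers, so TimeDistributed is shown here):

```python
# Continuing the sketch above: wrap the layer instance so it is applied
# independently to every timestep of a sequence.
bn = layers.BatchNormalization(axis=-1)        # the "Layer instance" output
seq_in = keras.Input(shape=(10, 64))           # (timesteps, features)
seq_out = layers.TimeDistributed(bn)(seq_in)   # the "Output" tensor, per timestep
```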