Batch normalization layer
This card is a wrapper of the Keras BatchNormalization class.

Note: the backend for building and training neural networks is based on Keras. The documentation of this card is adapted from the documentation of its corresponding Keras class.
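As a rough sketch of what this layer computes, consider the inference-time form of batch normalization: the input is normalized with the moving statistics, then scaled by gamma and shifted by beta. The function below is illustrative only (the name batch_norm and its signature are not part of the card); it is a minimal NumPy sketch, not the card's implementation.

```python
import numpy as np

# Minimal sketch of the batch-normalization transform (inference form):
# normalize with the moving statistics, then scale by gamma and shift
# by beta. Epsilon keeps the division numerically safe.
def batch_norm(x, moving_mean, moving_var, gamma, beta, epsilon=1e-3):
    x_hat = (x - moving_mean) / np.sqrt(moving_var + epsilon)
    return gamma * x_hat + beta

# Toy usage: normalize a small batch with its own statistics.
x = np.random.randn(4, 3).astype("float32")
y = batch_norm(x, x.mean(axis=0), x.var(axis=0), gamma=1.0, beta=0.0)
```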
Inputs
- Axis — Integer
  The axis that should be normalized (typically the features axis). For instance, after a Convolution 2D layer with Data format = “Channels first”, set Axis = “1”.
- Momentum — Float
  Momentum for the moving average.
- Epsilon — Float
  Small float added to the variance to avoid dividing by zero.
- Center — Boolean
  If “true”, add the offset beta to the normalized tensor. If “false”, beta is ignored.
- Scale — Boolean
  If “true”, multiply by gamma. If “false”, gamma is not used. When the next layer is linear (also e.g. ReLU), this can be disabled, since the scaling will be done by the next layer.
- Beta initializer — NeuralNetworkInitializer
  Initializer for the beta weight. If not specified, the Zeros initializer is used.
- Gamma initializer — NeuralNetworkInitializer
  Initializer for the gamma weight. If not specified, the Ones initializer is used.
- Moving mean initializer — NeuralNetworkInitializer
  Initializer for the moving mean. If not specified, the Zeros initializer is used.
- Moving variance initializer — NeuralNetworkInitializer
  Initializer for the moving variance. If not specified, the Ones initializer is used.
- Beta regularizer — NeuralNetworkRegularizer
  Optional regularizer for the beta weight.
- Gamma regularizer — NeuralNetworkRegularizer
  Optional regularizer for the gamma weight.
- Beta constraint — NeuralNetworkConstraint
  Optional constraint for the beta weight.
- Gamma constraint — NeuralNetworkConstraint
  Optional constraint for the gamma weight.
- Input — NeuralNetworkTensor
  Input of this layer. (A sketch mapping these inputs onto the Keras class follows this list.)
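The sketch below shows how these inputs map onto the wrapped Keras class, assuming the card forwards them to keras.layers.BatchNormalization unchanged. The values shown are the Keras defaults, except Axis, which follows the “Channels first” example above.

```python
from tensorflow import keras

# Sketch: the card's inputs forwarded to keras.layers.BatchNormalization.
# Values are the Keras defaults, except axis=1 for "channels first" data.
layer = keras.layers.BatchNormalization(
    axis=1,                               # Axis
    momentum=0.99,                        # Momentum
    epsilon=1e-3,                         # Epsilon
    center=True,                          # Center: learn the beta offset
    scale=True,                           # Scale: learn the gamma factor
    beta_initializer="zeros",             # Beta initializer
    gamma_initializer="ones",             # Gamma initializer
    moving_mean_initializer="zeros",      # Moving mean initializer
    moving_variance_initializer="ones",   # Moving variance initializer
    beta_regularizer=None,                # Beta regularizer (optional)
    gamma_regularizer=None,               # Gamma regularizer (optional)
    beta_constraint=None,                 # Beta constraint (optional)
    gamma_constraint=None,                # Gamma constraint (optional)
)
```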
Outputs
- Layer instance — NeuralNetworkLayer
  Instance of this layer. It can be wrapped using a Bidirectional or a TimeDistributed wrapper.
- Output — NeuralNetworkTensor
  Output of this layer.
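A short usage sketch, continuing the Keras example above: applying the layer instance to an input tensor produces the output tensor of this card. The shapes here are illustrative.

```python
# Usage sketch (continues the example above): apply the layer instance
# to an input tensor to obtain the output tensor of this card.
inputs = keras.Input(shape=(3, 32, 32))   # channels-first feature maps
outputs = layer(inputs)                   # normalized along axis 1
model = keras.Model(inputs, outputs)

# The same instance could also be wrapped, e.g. with
# keras.layers.TimeDistributed(layer), to apply it per timestep.
```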