Parametric ReLU layer
This card is a wrapper of the Keras PReLU class.
Note: the backend for building and training neural networks is based on Keras. The documentation of this card is adapted from the documentation of the corresponding Keras class.
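For orientation, a minimal sketch of the wrapped activation, assuming the TensorFlow Keras backend (the platform may bundle a different Keras version):

```python
import tensorflow as tf

# Parametric ReLU: f(x) = x        for x > 0
#                  f(x) = alpha*x  for x <= 0
# where alpha is a learnable weight, one per input element by default.
prelu = tf.keras.layers.PReLU()

x = tf.constant([[-2.0, 3.0]])
y = prelu(x)  # with the default Zeros initializer alpha = 0, so y = [[0.0, 3.0]]
```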
Inputs
- Alpha initializer — NeuralNetworkInitializer
  Initializer function for the weights. If not specified, the Zeros initializer is used.
- Alpha regularizer — NeuralNetworkRegularizer
  Regularizer for the weights.
- Alpha constraint — NeuralNetworkConstraint
  Constraint for the weights.
- Shared axes — List of Integer
  The axes along which to share learnable parameters for the activation function. For example, if the incoming feature maps come from a 2D convolution with output shape (batch, height, width, channels) and you wish to share parameters across space so that each filter has only one set of parameters, set Shared axes = “1, 2” (see the sketch after this list).
- Input — NeuralNetworkTensor
  Input of this layer.
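A sketch of how these inputs map onto the Keras constructor arguments, assuming tf.keras; the L2 strength (1e-4) and the NonNeg constraint are illustrative choices, not defaults of this card:

```python
import tensorflow as tf

prelu = tf.keras.layers.PReLU(
    alpha_initializer=tf.keras.initializers.Zeros(),   # Alpha initializer
    alpha_regularizer=tf.keras.regularizers.L2(1e-4),  # Alpha regularizer (illustrative)
    alpha_constraint=tf.keras.constraints.NonNeg(),    # Alpha constraint (illustrative)
    shared_axes=[1, 2],                                # Shared axes = "1, 2"
)

# Applied to a 2D-convolution-style output of shape (batch, height, width, channels),
# sharing across axes 1 and 2 leaves one alpha per filter:
x = tf.keras.Input(shape=(32, 32, 16))
y = prelu(x)  # prelu.weights[0] has shape (1, 1, 16)
```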
Outputs
- Layer instance — NeuralNetworkLayer
  Instance of this layer. It can be wrapped using a Bidirectional or a TimeDistributed wrapper (see the sketch after this list).
- Output — NeuralNetworkTensor
  Output of this layer.
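A sketch of wrapping the layer instance, shown here for the TimeDistributed case and assuming the tf.keras wrappers:

```python
import tensorflow as tf

# The same PReLU instance (and its alpha weights) is applied at every timestep.
prelu = tf.keras.layers.PReLU()
wrapped = tf.keras.layers.TimeDistributed(prelu)

seq = tf.keras.Input(shape=(10, 64))  # (batch, timesteps, features)
out = wrapped(seq)                    # shape (batch, 10, 64), PReLU applied per timestep
```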