ReLU layer
This card is a wrapper of the Keras ReLU class.
Note: the backend for building and training neural networks is based on Keras. This card's documentation is therefore adapted from the documentation of its corresponding Keras class.
Inputs
- Max value — Float
  Maximum activation value; must be greater than or equal to 0. If not specified, the activation is unbounded above.
- Negative slope — Float
  Slope coefficient applied to negative inputs, as in a leaky ReLU; must be greater than or equal to 0.
- Threshold — Float
  Threshold value for the thresholded activation. The combined effect of these three parameters is sketched after this list.
- Input — NeuralNetworkTensor
  Input of this layer.
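The three parameters above map onto the arguments of keras.layers.ReLU. Below is a minimal sketch of the resulting piecewise behavior, assuming the tensorflow.keras import path; the sample values are illustrative only.

```python
import numpy as np
from tensorflow.keras.layers import ReLU

# Illustrative values for the three card inputs.
relu = ReLU(max_value=6.0, negative_slope=0.1, threshold=1.0)

# Piecewise behavior of the layer:
#   f(x) = max_value                         for x >= max_value
#   f(x) = x                                 for threshold <= x < max_value
#   f(x) = negative_slope * (x - threshold)  otherwise
x = np.array([[-2.0, 0.5, 1.0, 3.0, 10.0]], dtype="float32")
print(relu(x).numpy())  # [[-0.3  -0.05  1.    3.    6.  ]]
```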
Outputs
- Layer instance — NeuralNetworkLayer
  Instance of this layer. It can be wrapped using a Bidirectional or a TimeDistributed wrapper, as shown in the sketch after this list.
- Output — NeuralNetworkTensor
  Output of this layer.
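As an example of the wrapping mentioned above, the layer instance can be passed to TimeDistributed so the activation is applied to every time step of a sequence. A minimal sketch, assuming the standard tensorflow.keras API; the input shape is illustrative only.

```python
from tensorflow.keras.layers import Input, ReLU, TimeDistributed
from tensorflow.keras.models import Model

# Illustrative sequence input: 10 time steps of 32 features each.
inputs = Input(shape=(10, 32))
# Wrap the ReLU layer instance so it is applied at every time step.
outputs = TimeDistributed(ReLU(max_value=6.0))(inputs)
model = Model(inputs, outputs)
model.summary()
```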