Separable convolution 1D layer
This card wraps the Keras SeparableConv1D class.
Note: the backend for building and training neural networks is Keras; the documentation of this card is therefore adapted from the documentation of the corresponding Keras class.
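As a minimal sketch of the underlying class (assuming the TensorFlow Keras backend and illustrative parameter values), the wrapped layer can be instantiated and applied to a tensor directly:

```python
import tensorflow as tf

# Illustrative values for Filters, Kernel size, Strides, Padding, Activation.
layer = tf.keras.layers.SeparableConv1D(
    filters=32,
    kernel_size=3,
    strides=1,
    padding="same",
    activation="relu",
)

# A batch of 4 sequences, 100 steps long, with 8 channels (channels_last).
x = tf.random.normal((4, 100, 8))
y = layer(x)
print(y.shape)  # (4, 100, 32): "same" padding preserves the step dimension
```

With "same" padding and a stride of 1, only the channel dimension changes, from the 8 input channels to the 32 output filters.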
Inputs
-
Filters — Integer
The dimensionality of the output space (i.e. the number of output filters in the convolution).
-
Kernel size — Integer
The length of the 1D convolution window.
-
Strides — Integer
The stride length of the convolution. Specifying any Strides value other than 1 is incompatible with specifying a Dilation rate other than 1.
-
Padding — String
The padding algorithm: "valid" applies no padding, while "same" zero-pads the input so that the output has the same length as the input (when Strides is 1).
-
Data format — String
The ordering of the dimensions in the inputs: "channels_last" corresponds to inputs with shape (batch, steps, channels), while "channels_first" corresponds to (batch, channels, steps).
-
Dilation rate — Integer
The dilation rate to use for dilated convolution. Specifying any Dilation rate other than 1 is incompatible with specifying a Strides value other than 1.
-
Depth multiplier — Integer
The number of depthwise convolution output channels for each input channel. The total number of depthwise convolution output channels will be equal to the number of input filters multiplied by Depth multiplier.
-
Activation — String
The activation function to use. If none is specified, no activation is applied (i.e. linear activation).
-
Use bias — Boolean
Whether the layer uses a bias vector.
-
Depthwise initializer — NeuralNetworkInitializer
An initializer for the depthwise convolution kernel. If not specified, the Glorot uniform initializer is used.
-
Pointwise initializer — NeuralNetworkInitializer
An initializer for the pointwise convolution kernel. If not specified, the Glorot uniform initializer is used.
-
Bias initializer — NeuralNetworkInitializer
An initializer for the bias vector. If not specified, the Zeros initializer is used.
-
Depthwise regularizer — NeuralNetworkRegularizer
Optional regularizer for the depthwise convolution kernel.
-
Pointwise regularizer — NeuralNetworkRegularizer
Optional regularizer for the pointwise convolution kernel.
-
Bias regularizer — NeuralNetworkRegularizer
Regularizer function applied to the bias vector.
-
Activity regularizer — NeuralNetworkRegularizer
Regularizer function applied to the output of the layer (its “activation”).
-
Depthwise constraint — NeuralNetworkConstraint
Optional projection function to be applied to the depthwise kernel after being updated by the optimizer (e.g. used for norm constraints or value constraints for layer weights). The function must take as input the unprojected variable and must return the projected variable (which must have the same shape). Constraints are not safe to use when doing asynchronous distributed training.
-
Pointwise constraint — NeuralNetworkConstraint
Optional projection function to be applied to the pointwise kernel after being updated by the optimizer.
-
Bias constraint — NeuralNetworkConstraint
Constraint function applied to the bias vector.
-
Input — NeuralNetworkTensor
Input of this layer.
Outputs
-
Layer instance — NeuralNetworkLayer
Instance of this layer. It can be wrapped using a Bidirectional or a TimeDistributed wrapper.
-
Output — NeuralNetworkTensor
Output of this layer.
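The channel arithmetic described under Depth multiplier can be verified by inspecting the layer's weights. The following sketch assumes the TensorFlow Keras backend; the `depthwise_kernel` and `pointwise_kernel` attribute names are those used by the Keras implementation:

```python
import tensorflow as tf

in_channels, depth_multiplier, filters, kernel_size = 8, 2, 32, 3

layer = tf.keras.layers.SeparableConv1D(
    filters=filters,
    kernel_size=kernel_size,
    depth_multiplier=depth_multiplier,
)
# Build the weights for a channels_last input of 8 channels.
layer.build((None, 100, in_channels))

# Depthwise kernel: (kernel_size, in_channels, depth_multiplier),
# i.e. 8 * 2 = 16 depthwise output channels in total.
print(layer.depthwise_kernel.shape)  # (3, 8, 2)

# Pointwise kernel mixes those 16 channels down to the 32 output filters.
print(layer.pointwise_kernel.shape)  # (1, 16, 32)

# Parameter count: 3*8*2 (depthwise) + 1*16*32 (pointwise) + 32 (bias) = 592.
print(layer.count_params())  # 592
```

Because the depthwise and pointwise stages are factored apart, this layer uses far fewer parameters than a regular Conv1D with the same Filters and Kernel size (which would need 3 * 8 * 32 + 32 = 800).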