Simple activation function to be applied to the output.
Add a (non-learnable) scalar constant to the input.
Applies an atrous convolution operator for filtering neighborhoods of 1-D inputs.
Applies an atrous convolution operator for filtering windows of 2-D inputs.
Applies average pooling operation for temporal data.
Applies average pooling operation for spatial data.
Applies average pooling operation for 3D data (spatial or spatio-temporal).
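The average-pooling operation described above can be sketched in plain NumPy for the 1-D (temporal) case; `pool_size` and `stride` are illustrative names for this sketch, not necessarily this library's argument names:

```python
import numpy as np

def avg_pool_1d(x, pool_size, stride):
    # Slide a window of length pool_size along the time axis and
    # average the values inside each window (no padding).
    steps = (len(x) - pool_size) // stride + 1
    return np.array([x[i * stride : i * stride + pool_size].mean()
                     for i in range(steps)])

out = avg_pool_1d(np.array([1.0, 3.0, 2.0, 4.0]), pool_size=2, stride=2)
```

The 2-D and 3-D variants apply the same windowed average over two or three spatial axes.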
Batch normalization layer.
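The training-time math behind batch normalization can be sketched as follows. This is a minimal NumPy illustration (per-feature statistics over the batch, then a learnable scale and shift), not the layer's implementation, which also maintains running statistics for inference:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch dimension, then apply
    # the learnable scale (gamma) and shift (beta).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.array([[1.0, 2.0], [3.0, 4.0]])
y = batch_norm(x, gamma=1.0, beta=0.0)
```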
Bidirectional wrapper for RNNs.
Threshold the input.
Adds a learnable bias of the given size element-wise to the input.
Multiplies the input element-wise by a learnable weight of the given size.
Convolutional LSTM.
Applies convolution operator for filtering neighborhoods of 1-D inputs.
Applies a 2D convolution over an input image composed of several input planes.
Applies convolution operator for filtering windows of three-dimensional inputs.
Cropping layer for 1D input (e.g. temporal sequence).
Cropping layer for 2D input (e.g. picture).
Cropping layer for 3D data (e.g. spatial or spatio-temporal).
Transposed convolution operator for filtering windows of 2-D inputs.
A densely-connected NN layer.
Applies Dropout to the input by randomly setting a fraction 'p' of input units to 0 at each update during training time in order to prevent overfitting.
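The mechanism can be sketched in NumPy as "inverted" dropout, the common variant that rescales at training time so inference is a no-op; the function and argument names here are illustrative:

```python
import numpy as np

def dropout(x, p, rng, training=True):
    # During training, zero each unit with probability p and scale the
    # survivors by 1/(1-p) so the expected activation is unchanged;
    # at inference, pass the input through untouched.
    if not training:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

rng = np.random.default_rng(0)
x = np.ones((4, 1000))
y = dropout(x, p=0.5, rng=rng)
```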
Exponential Linear Unit.
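The ELU activation is defined piecewise: identity for positive inputs, a saturating exponential for negative ones. A NumPy sketch:

```python
import numpy as np

def elu(x, alpha=1.0):
    # f(x) = x for x > 0, alpha * (exp(x) - 1) otherwise;
    # the negative branch saturates at -alpha.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

out = elu(np.array([-1.0, 0.0, 2.0]))
```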
Turn positive integers (indexes) into dense vectors of fixed size.
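An embedding is just a table lookup: row i of a learnable weight matrix is the dense vector for index i. A NumPy sketch with illustrative sizes:

```python
import numpy as np

# Sizes are illustrative; in the layer, `table` is a learned parameter.
vocab_size, dim = 10, 4
rng = np.random.default_rng(0)
table = rng.standard_normal((vocab_size, dim))

indices = np.array([1, 5, 1])
vectors = table[indices]  # shape (3, 4): one dense vector per index
```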
Applies element-wise exp to the input.
Flattens the input without affecting the batch size.
Gated Recurrent Unit architecture.
Apply multiplicative 1-centered Gaussian noise.
Apply additive zero-centered Gaussian noise.
Takes {mean, log_variance} as input and samples from the corresponding Gaussian distribution.
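This kind of sampler is typically implemented with the reparameterization trick, so gradients can flow through the mean and log-variance; a NumPy sketch of that idea (illustrative names, not this library's API):

```python
import numpy as np

def gaussian_sample(mean, log_variance, rng):
    # Reparameterization trick: draw eps ~ N(0, 1), then shift and
    # scale it, keeping mean and log_variance differentiable.
    eps = rng.standard_normal(mean.shape)
    return mean + np.exp(0.5 * log_variance) * eps

rng = np.random.default_rng(0)
samples = gaussian_sample(np.full(10000, 2.0), np.zeros(10000), rng)
```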
Applies global average pooling operation for temporal data.
Applies global average pooling operation for spatial data.
Applies global average pooling operation for 3D data.
Applies global max pooling operation for temporal data.
Applies global max pooling operation for spatial data.
Applies global max pooling operation for 3D data.
Applies the hard shrinkage function element-wise to the input.
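Hard shrinkage zeroes every value whose magnitude is at most a threshold lambda and passes the rest through unchanged; a NumPy sketch:

```python
import numpy as np

def hard_shrink(x, lam=0.5):
    # Keep x where |x| > lam, otherwise output 0.
    return np.where(np.abs(x) > lam, x, 0.0)

out = hard_shrink(np.array([-1.0, -0.3, 0.2, 0.8]))
```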
Applies the hard tanh function element-wise to the input.
Densely connected highway network.
Wrap a Torch-style layer as a Keras-style layer.
Local Response Normalization between different feature maps.
Long Short-Term Memory unit architecture.
Leaky version of a Rectified Linear Unit.
Locally-connected layer for 1D inputs which works similarly to the TemporalConvolution layer, except that weights are unshared, that is, a different set of filters is applied at each different patch of the input.
Locally-connected layer for 2D inputs that works similarly to the SpatialConvolution layer, except that weights are unshared, that is, a different set of filters is applied at each different patch of the input.
Applies a log transformation to the input.
Use a mask value to skip timesteps for a sequence.
Applies max pooling operation for temporal data.
Applies max pooling operation for spatial data.
Applies max pooling operation for 3D data (spatial or spatio-temporal).
A dense maxout layer that takes the element-wise maximum of linear layers.
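The maxout idea is to evaluate several linear maps of the same input and keep the element-wise maximum across them; a NumPy sketch with hand-picked weights (in the layer, the weights and biases are learned):

```python
import numpy as np

def maxout(x, weights, biases):
    # Evaluate each linear map, then take the element-wise maximum
    # across the candidate outputs.
    outs = np.stack([x @ w + b for w, b in zip(weights, biases)])
    return outs.max(axis=0)

x = np.array([1.0, -1.0])
weights = [np.eye(2), -np.eye(2)]   # two candidate linear maps
biases = [np.zeros(2), np.zeros(2)]
out = maxout(x, weights, biases)
```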
Used to merge a list of inputs into a single output, following some merge mode.
Multiply the input by a single (learnable) scalar factor.
Multiply the input by a (non-learnable) scalar constant.
Narrow the input along a given dimension; the number of dimensions is not reduced.
Computes the negative value of each element of the input.
Applies parametric ReLU, where parameter varies the slope of the negative part.
Permutes the dimensions of the input according to a given pattern.
Applies an element-wise power operation with scale and shift to the input.
Applies the randomized leaky rectified linear unit element-wise to the input.
Repeats the input n times.
Reshapes an output to a certain shape.
Resize the input image with bilinear interpolation.
S-shaped Rectified Linear Unit.
Scale is the combination of CMul and CAdd.
Select the given index of the input along the given dimension and return that slice.
Applies separable convolution operator for 2D inputs.
Applies a 2D convolution over an input image composed of several input planes.
A fully-connected recurrent neural network cell.
Applies the soft shrinkage function element-wise to the input.
Spatial 1D version of Dropout.
Spatial 2D version of Dropout.
Spatial 3D version of Dropout.
Applies an element-wise square root operation to the input.
Applies an element-wise square operation to the input.
Delete the singleton dimension(s).
Threshold the input Tensor.
Thresholded Rectified Linear Unit.
TimeDistributed wrapper.
UpSampling layer for 1D inputs.
UpSampling layer for 2D inputs.
UpSampling layer for 3D inputs.
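For the 1-D case, upsampling simply repeats each timestep a fixed number of times along the time axis; a NumPy sketch (the factor name `length` is illustrative):

```python
import numpy as np

def upsample_1d(x, length=2):
    # Repeat each timestep `length` times along the time axis.
    return np.repeat(x, length, axis=0)

out = upsample_1d(np.array([1.0, 2.0]), length=3)
```

The 2-D and 3-D variants repeat values along two or three spatial axes in the same way.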
The local response normalization layer performs a kind of "lateral inhibition" by normalizing over local input regions.
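The "lateral inhibition" above can be sketched across the channel axis: each activation is divided by a term that grows with the summed squares of its neighboring channels. This is a minimal NumPy illustration; `size`, `alpha`, `beta`, and `k` are illustrative defaults, and some formulations scale `alpha` by the window size:

```python
import numpy as np

def local_response_norm(x, size=5, alpha=1e-4, beta=0.75, k=1.0):
    # x has shape (channels, ...). For each channel, divide by a
    # factor built from the squared activations of up to `size`
    # neighboring channels (clipped at the channel-axis edges).
    c = x.shape[0]
    out = np.empty_like(x)
    half = size // 2
    for i in range(c):
        lo, hi = max(0, i - half), min(c, i + half + 1)
        denom = (k + alpha * (x[lo:hi] ** 2).sum(axis=0)) ** beta
        out[i] = x[i] / denom
    return out

out = local_response_norm(np.ones((3, 2)), size=3)
```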
Zero-padding layer for 1D input (e.g. temporal sequence).
Zero-padding layer for 2D input (e.g. picture).
Zero-padding layer for 3D data (spatial or spatio-temporal).
Used to instantiate an input node.
Used as an entry point into a model.