| Class | Description |
|---|---|
| AbstractLSTM | LSTM recurrent net, based on Graves: Supervised Sequence Labelling with Recurrent Neural Networks, http://www.cs.toronto.edu/~graves/phd.pdf |
| AbstractLSTM.Builder<T extends AbstractLSTM.Builder<T>> | |
| ActivationLayer | |
| ActivationLayer.Builder | |
| AutoEncoder | Autoencoder. |
| AutoEncoder.Builder | |
| BaseLayer | A neural network layer. |
| BaseLayer.Builder<T extends BaseLayer.Builder<T>> | |
| BaseOutputLayer | |
| BaseOutputLayer.Builder<T extends BaseOutputLayer.Builder<T>> | |
| BasePretrainNetwork | |
| BasePretrainNetwork.Builder<T extends BasePretrainNetwork.Builder<T>> | |
| BaseRecurrentLayer | |
| BaseRecurrentLayer.Builder<T extends BaseRecurrentLayer.Builder<T>> | |
| BatchNormalization | Batch normalization configuration. |
| BatchNormalization.Builder | |
| CenterLossOutputLayer | Center loss is similar to triplet loss, except that it enforces intra-class consistency and does not require feeding multiple examples forward. |
| CenterLossOutputLayer.Builder | |
| Convolution1DLayer | 1D (temporal) convolutional layer. |
| Convolution1DLayer.Builder | |
| ConvolutionLayer | |
| ConvolutionLayer.BaseConvBuilder<T extends ConvolutionLayer.BaseConvBuilder<T>> | |
| ConvolutionLayer.Builder | |
| DenseLayer | Dense layer: a fully connected feed-forward layer trainable by backprop. |
| DenseLayer.Builder | |
| DropoutLayer | |
| DropoutLayer.Builder | |
| EmbeddingLayer | Embedding layer: a feed-forward layer that expects a single integer per example as input (a class index in the range 0 to numClass-1). |
| EmbeddingLayer.Builder | |
| FeedForwardLayer | Base class for feed-forward layer configurations. |
| FeedForwardLayer.Builder<T extends FeedForwardLayer.Builder<T>> | |
| GlobalPoolingLayer | Global pooling layer - used to do pooling over time for RNNs and 2D pooling for CNNs. Supports the following pooling types: SUM, AVG, MAX, PNORM. Can also handle mask arrays when dealing with variable-length inputs. |
| GlobalPoolingLayer.Builder | |
| GravesBidirectionalLSTM | LSTM recurrent net, based on Graves: Supervised Sequence Labelling with Recurrent Neural Networks, http://www.cs.toronto.edu/~graves/phd.pdf |
| GravesBidirectionalLSTM.Builder | |
| GravesLSTM | LSTM recurrent net, based on Graves: Supervised Sequence Labelling with Recurrent Neural Networks, http://www.cs.toronto.edu/~graves/phd.pdf |
| GravesLSTM.Builder | |
| InputTypeUtil | Utilities for calculating input types. |
| Layer | A neural network layer. |
| Layer.Builder<T extends Layer.Builder<T>> | |
| LayerValidation | Utilities for validating layer configurations. |
| LocalResponseNormalization | Local response normalization layer configuration. |
| LocalResponseNormalization.Builder | |
| LossLayer | LossLayer is a flexible output "layer" that performs a loss function on an input without MLP logic. |
| LossLayer.Builder | |
| LSTM | LSTM recurrent net without peephole connections. |
| LSTM.Builder | |
| OutputLayer | Output layer: a fully connected layer with a loss function, used as the final layer of a network for classification or regression. |
| OutputLayer.Builder | |
| RBM | Restricted Boltzmann Machine. |
| RBM.Builder | |
| RnnOutputLayer | |
| RnnOutputLayer.Builder | |
| Subsampling1DLayer | 1D (temporal) subsampling layer. |
| Subsampling1DLayer.Builder | |
| SubsamplingLayer | Subsampling layer, also referred to as pooling in convolutional neural nets. Supports the following pooling types: MAX, AVG, NONE. |
| SubsamplingLayer.BaseSubsamplingBuilder<T extends SubsamplingLayer.BaseSubsamplingBuilder<T>> | |
| SubsamplingLayer.Builder | |
| ZeroPaddingLayer | Zero padding layer for convolutional neural networks. |
| ZeroPaddingLayer.Builder | |
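
Most of the layer classes above are configured through their nested Builder classes and then assembled into a network configuration. The sketch below is a minimal, hedged example assuming the DL4J NeuralNetConfiguration builder API of the same era as this listing; the seed, layer sizes, activation, and loss function are illustrative placeholders rather than values taken from this documentation.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class LayerBuilderExample {
    public static void main(String[] args) {
        // Placeholder sizes: 784 inputs, 100 hidden units, 10 output classes.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(123)
                .list()
                // DenseLayer: fully connected feed-forward layer, built via its Builder
                .layer(0, new DenseLayer.Builder()
                        .nIn(784)
                        .nOut(100)
                        .activation(Activation.RELU)
                        .build())
                // OutputLayer: final layer with a loss function
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nIn(100)
                        .nOut(10)
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();

        // Print the resulting configuration as JSON
        System.out.println(conf.toJson());
    }
}
```
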
| Enum | Description |
|---|---|
| ConvolutionLayer.AlgoMode | The "PREFER_FASTEST" mode will pick the fastest algorithm for the specified parameters from the ConvolutionLayer.FwdAlgo, ConvolutionLayer.BwdFilterAlgo, and ConvolutionLayer.BwdDataAlgo lists, but these may be very memory intensive; if strange errors occur when using cuDNN, try the "NO_WORKSPACE" mode. |
| ConvolutionLayer.BwdDataAlgo | The backward data algorithm to use when ConvolutionLayer.AlgoMode is set to "USER_SPECIFIED". |
| ConvolutionLayer.BwdFilterAlgo | The backward filter algorithm to use when ConvolutionLayer.AlgoMode is set to "USER_SPECIFIED". |
| ConvolutionLayer.FwdAlgo | The forward algorithm to use when ConvolutionLayer.AlgoMode is set to "USER_SPECIFIED". |
| PoolingType | Pooling type enum used by pooling layers (for example SUM, AVG, MAX, PNORM). |
| RBM.HiddenUnit | |
| RBM.VisibleUnit | |
| SubsamplingLayer.PoolingType | |
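
As a hedged illustration of where these enums plug in, the sketch below configures a ConvolutionLayer with an explicit cuDNN AlgoMode and a SubsamplingLayer with MAX pooling. The kernel sizes, strides, and channel counts are placeholder values, and cudnnAlgoMode only has an effect when the cuDNN backend is actually in use.

```java
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer;

public class ConvEnumExample {
    public static void main(String[] args) {
        // NO_WORKSPACE trades speed for lower memory use; PREFER_FASTEST picks
        // the fastest (potentially memory-hungry) cuDNN algorithms.
        ConvolutionLayer conv = new ConvolutionLayer.Builder(5, 5)
                .nIn(1)
                .nOut(20)
                .stride(1, 1)
                .cudnnAlgoMode(ConvolutionLayer.AlgoMode.NO_WORKSPACE)
                .build();

        // MAX pooling over 2x2 windows with stride 2
        SubsamplingLayer pool = new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
                .kernelSize(2, 2)
                .stride(2, 2)
                .build();

        System.out.println(conv);
        System.out.println(pool);
    }
}
```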