| Interface | Description |
|---|---|
| LayerHelper | Helper interface for backend-specific layer implementations (for example, cuDNN-based helpers). |

| Class | Description |
|---|---|
| AbstractLayer<LayerConfT extends Layer> | A layer with input and output, but no parameters or gradients. |
| ActivationLayer | Applies an activation function to the input, and the corresponding derivative to epsilon during backpropagation. |
| BaseLayer<LayerConfT extends BaseLayer> | A layer with parameters. |
| BaseOutputLayer<LayerConfT extends BaseOutputLayer> | Base class for output layers supporting different training objectives, such as classification and regression. |
| BasePretrainNetwork<LayerConfT extends BasePretrainNetwork> | Base class for any neural network used as a layer in a deep network. |
| DropoutLayer | Applies dropout to the input activations, as a standalone layer. |
| FrozenLayer | Used for transfer learning; a frozen layer wraps another DL4J layer and keeps its parameters fixed during training. |
| FrozenLayerWithBackprop | Freezes the parameters of the layer it wraps, but allows backpropagation to continue through it. |
| LossLayer | A flexible output "layer" that applies a loss function to its input, without MLP logic. |
| OutputLayer | Output layer supporting different training objectives, such as classification and regression. |
| RepeatVector | Repeats a 2D input of shape [minibatch, size] n times along a new dimension, producing 3D output of shape [minibatch, size, n]. |
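
Most of the classes above are the runtime implementations behind the layer configuration classes in org.deeplearning4j.nn.conf.layers, and are rarely constructed directly. As a minimal sketch using the standard DL4J configuration API (the layer sizes, seed, and activation choices here are arbitrary assumptions, not taken from this page), a network combining a DenseLayer with ActivationLayer, DropoutLayer, and OutputLayer can be configured as:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.ActivationLayer;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.DropoutLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class LayerTableExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(42)
                .list()
                // Dense layer with identity activation; the nonlinearity is
                // applied by the separate ActivationLayer below
                .layer(new DenseLayer.Builder().nIn(784).nOut(128)
                        .activation(Activation.IDENTITY).build())
                // ActivationLayer: applies ReLU to the previous layer's output
                .layer(new ActivationLayer.Builder().activation(Activation.RELU).build())
                // DropoutLayer: dropout as a standalone layer
                // (in DL4J, 0.5 is the probability of *retaining* an activation)
                .layer(new DropoutLayer.Builder(0.5).build())
                // OutputLayer: a parameterized layer plus a loss function
                .layer(new OutputLayer.Builder(LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nIn(128).nOut(10)
                        .activation(Activation.SOFTMAX).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
        System.out.println(net.summary());
    }
}
```

Note that DL4J's dropout value is the probability of retaining an activation, not of dropping it.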
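FrozenLayer and FrozenLayerWithBackprop are typically created for you by the transfer learning API rather than configured by hand. A minimal sketch, assuming a hypothetical already-trained MultiLayerNetwork is passed in as `pretrained`; the Adam learning rate and the feature-extractor boundary (layer index 1) are arbitrary:

```java
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.transferlearning.FineTuneConfiguration;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.nd4j.linalg.learning.config.Adam;

public class FreezeExample {
    /** Returns a copy of the network in which layers 0 and 1 are wrapped in FrozenLayer. */
    public static MultiLayerNetwork freezeFirstTwoLayers(MultiLayerNetwork pretrained) {
        return new TransferLearning.Builder(pretrained)
                .fineTuneConfiguration(new FineTuneConfiguration.Builder()
                        .updater(new Adam(1e-4)) // applies to the layers that stay trainable
                        .build())
                .setFeatureExtractor(1) // layers up to and including index 1 are frozen
                .build();
    }
}
```

Training the returned network leaves the wrapped layers' parameters unchanged; only the layers above the frozen boundary are updated.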