public class LossLayer extends FeedForwardLayer
LossLayer is similar to OutputLayer in that both perform loss calculations for network outputs vs. labels, but LossLayer
does not have any parameters. Consequently, setting nIn/nOut isn't supported - the output size is the same as
the size of the input activations. A minimal usage sketch follows the nested-class table below.

| Modifier and Type | Class and Description |
|---|---|
| static class | LossLayer.Builder |
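As a usage illustration, here is a minimal configuration sketch. It assumes the standard NeuralNetConfiguration list-builder API and the lossFunction(...) setter on LossLayer.Builder; the layer sizes (784 in, 10 out) and the softmax/MCXENT pairing are example choices, not requirements:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.LossLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class LossLayerExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                // The preceding layer determines the output size (10 classes here);
                // its softmax activation pairs with the MCXENT loss below
                .layer(new DenseLayer.Builder().nIn(784).nOut(10)
                        .activation(Activation.SOFTMAX).build())
                // LossLayer has no parameters, so nIn/nOut are not set:
                // its output size equals the size of its input activations
                .layer(new LossLayer.Builder()
                        .lossFunction(LossFunctions.LossFunction.MCXENT)
                        .build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
    }
}
```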
| Modifier and Type | Field and Description |
|---|---|
| protected ILossFunction | lossFn |
Fields inherited from class FeedForwardLayer: nIn, nOut, timeDistributedFormat

Fields inherited from class BaseLayer: activationFn, biasInit, biasUpdater, gainInit, gradientNormalization, gradientNormalizationThreshold, iUpdater, regularization, regularizationBias, weightInitFn, weightNoise

Fields inherited from class Layer: constraints, iDropout, layerName

| Modifier | Constructor and Description |
|---|---|
| protected | LossLayer(LossLayer.Builder builder) |
| Modifier and Type | Method and Description |
|---|---|
| LayerMemoryReport | getMemoryReport(InputType inputType) - This is a report of the estimated memory consumption for the given layer |
| ParamInitializer | initializer() |
| Layer | instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType) |
| boolean | isPretrainParam(String paramName) - Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or, decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc) with no pretrainable parameters will return false for all (valid) inputs. |
Methods inherited from class FeedForwardLayer: getOutputType, getPreProcessorForInputType, setNIn

Methods inherited from class BaseLayer: clone, getGradientNormalization, getRegularizationByParam, getUpdaterByParam, resetLayerDefaultConfig

Methods inherited from class Layer: initializeConstraints, setDataType

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface TrainingConfig: getGradientNormalizationThreshold, getLayerName

protected ILossFunction lossFn
protected LossLayer(LossLayer.Builder builder)
public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)
Specified by: instantiate in class Layer

public boolean isPretrainParam(String paramName)
Specified by: isPretrainParam in interface TrainingConfig
Overrides: isPretrainParam in class FeedForwardLayer
Parameters: paramName - Parameter name/key
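A sketch of the expected behavior, assuming the Builder API shown above. Since LossLayer defines no parameters at all, the contract above implies false for any queried name; the name "W" below is purely hypothetical:

```java
import org.deeplearning4j.nn.conf.layers.LossLayer;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class IsPretrainParamExample {
    public static void main(String[] args) {
        LossLayer lossLayer = new LossLayer.Builder()
                .lossFunction(LossFunctions.LossFunction.MSE)
                .build();

        // LossLayer has no parameters, hence no pretrain-only parameters:
        // this is expected to print false ("W" is a hypothetical name)
        System.out.println(lossLayer.isPretrainParam("W"));
    }
}
```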
public LayerMemoryReport getMemoryReport(InputType inputType)

This is a report of the estimated memory consumption for the given layer

Specified by: getMemoryReport in class Layer
Parameters: inputType - Input type to the layer. Memory consumption is often a function of the input type
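A sketch of querying the memory report, assuming the InputType.feedForward(int) factory; the feature count 128 is an arbitrary example value:

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.LossLayer;
import org.deeplearning4j.nn.conf.memory.LayerMemoryReport;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class MemoryReportExample {
    public static void main(String[] args) {
        LossLayer lossLayer = new LossLayer.Builder()
                .lossFunction(LossFunctions.LossFunction.MSE)
                .build();

        // Memory consumption is often a function of the input type;
        // here: a feed-forward input with 128 features (arbitrary example size)
        LayerMemoryReport report = lossLayer.getMemoryReport(InputType.feedForward(128));
        System.out.println(report);
    }
}
```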
public ParamInitializer initializer()

Specified by: initializer in class Layer