public class LossLayer extends FeedForwardLayer

LossLayer is similar to OutputLayer
in that both perform loss calculations for network outputs vs. labels, but LossLayer
does not have any parameters. Consequently, setting nIn/nOut isn't supported - the output size is the same size as
the input activations. A minimal configuration sketch is shown after the nested class summary table below.

Nested Class Summary

Modifier and Type | Class and Description |
---|---|
static class | LossLayer.Builder |
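The following is a minimal configuration sketch, not taken from this Javadoc: it assumes a recent Deeplearning4j release and shows a LossLayer used as the final layer of a small MLP. The LossLayer itself gets no nIn/nOut; the preceding DenseLayer determines the output size. DenseLayer, NeuralNetConfiguration, and the MCXENT loss are standard DL4J identifiers, but the exact builder chain here is an illustrative assumption, not part of this page.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.LossLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class LossLayerConfigSketch {
    public static void main(String[] args) {
        // Sketch only: the DenseLayer produces the 10 output activations,
        // and the parameter-free LossLayer applies softmax plus multi-class
        // cross-entropy against the labels. No nIn/nOut on the LossLayer.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new DenseLayer.Builder()
                        .nIn(784).nOut(10)
                        .activation(Activation.IDENTITY)
                        .build())
                .layer(1, new LossLayer.Builder()
                        .lossFunction(LossFunctions.LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();
        System.out.println(conf.toJson());
    }
}
```

Calling nIn/nOut on LossLayer.Builder is expected to fail, since, as described above, the layer has no parameters of its own.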
Field Summary

Modifier and Type | Field and Description |
---|---|
protected ILossFunction | lossFn |
Fields inherited from class FeedForwardLayer: nIn, nOut, timeDistributedFormat

Fields inherited from class BaseLayer: activationFn, biasInit, biasUpdater, gainInit, gradientNormalization, gradientNormalizationThreshold, iUpdater, regularization, regularizationBias, weightInitFn, weightNoise

Fields inherited from class Layer: constraints, iDropout, layerName
Constructor Summary

Modifier | Constructor and Description |
---|---|
protected | LossLayer(LossLayer.Builder builder) |
Method Summary

Modifier and Type | Method and Description |
---|---|
LayerMemoryReport | getMemoryReport(InputType inputType) - This is a report of the estimated memory consumption for the given layer |
ParamInitializer | initializer() |
Layer | instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType) |
boolean | isPretrainParam(String paramName) - Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or, decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc) with no pretrainable parameters will return false for all (valid) inputs. |
Methods inherited from class FeedForwardLayer: getOutputType, getPreProcessorForInputType, setNIn

Methods inherited from class BaseLayer: clone, getGradientNormalization, getRegularizationByParam, getUpdaterByParam, resetLayerDefaultConfig

Methods inherited from class Layer: initializeConstraints, setDataType

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface TrainingConfig: getGradientNormalizationThreshold, getLayerName
Field Detail

protected ILossFunction lossFn
Constructor Detail

protected LossLayer(LossLayer.Builder builder)
Method Detail

public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)
Specified by: instantiate in class Layer
public boolean isPretrainParam(String paramName)
Description copied from class: Layer
Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or, decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc) with no pretrainable parameters will return false for all (valid) inputs.
Specified by: isPretrainParam in interface TrainingConfig
Overrides: isPretrainParam in class FeedForwardLayer
Parameters: paramName - Parameter name/key
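As a quick hedged illustration of the contract described above (a sketch assuming standard DL4J builder usage, not code from this page): LossLayer has no parameters, so isPretrainParam is expected to return false for any parameter name.

```java
import org.deeplearning4j.nn.conf.layers.LossLayer;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class IsPretrainParamSketch {
    public static void main(String[] args) {
        LossLayer lossLayer = new LossLayer.Builder()
                .lossFunction(LossFunctions.LossFunction.MSE)
                .build();
        // LossLayer has no parameters, so nothing is a pretrain-only parameter
        System.out.println(lossLayer.isPretrainParam("W")); // expected: false
    }
}
```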
public LayerMemoryReport getMemoryReport(InputType inputType)
Description copied from class: Layer
This is a report of the estimated memory consumption for the given layer
Specified by: getMemoryReport in class Layer
Parameters: inputType - Input type to the layer. Memory consumption is often a function of the input type
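A short usage sketch (assumed usage, not shown on this page beyond the signature above): requesting the memory report for a feed-forward input of size 10 and printing it. InputType.feedForward and LayerMemoryReport are the standard DL4J types for this call; the builder setup mirrors the earlier sketch.

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.LossLayer;
import org.deeplearning4j.nn.conf.memory.LayerMemoryReport;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class MemoryReportSketch {
    public static void main(String[] args) {
        LossLayer lossLayer = new LossLayer.Builder()
                .lossFunction(LossFunctions.LossFunction.MCXENT)
                .build();
        // The estimate depends on the input type passed in
        LayerMemoryReport report = lossLayer.getMemoryReport(InputType.feedForward(10));
        System.out.println(report);
    }
}
```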
public ParamInitializer initializer()
Specified by: initializer in class Layer