public class RnnLossLayer extends FeedForwardLayer
Unlike RnnOutputLayer, this RnnLossLayer does not have any parameters - i.e., there is no time distributed dense component here. Consequently, the output activations size is equal to the input size.

See Also: RnnOutputLayer, Serialized Form
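For orientation, a minimal configuration sketch (not part of the original Javadoc): the class name, layer sizes, MCXENT loss and softmax activation below are illustrative choices. Because the loss layer has no parameters, the preceding layer's nOut must already equal the per-time-step label count.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.RnnLossLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class RnnLossLayerExample {
    public static void main(String[] args) {
        int nIn = 5;       // features per time step (illustrative)
        int nClasses = 3;  // labels per time step (illustrative)

        // The preceding LSTM must already produce nClasses activations per time step:
        // RnnLossLayer adds no time distributed dense component, so output size == input size.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new LSTM.Builder().nIn(nIn).nOut(nClasses).activation(Activation.TANH).build())
                .layer(new RnnLossLayer.Builder(LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
        System.out.println("Parameters: " + net.numParams()); // all parameters come from the LSTM
    }
}
```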
**Nested Class Summary**

Modifier and Type | Class and Description |
---|---|
static class | RnnLossLayer.Builder |
**Field Summary**

Modifier and Type | Field and Description |
---|---|
protected ILossFunction | lossFn |
Fields inherited from class FeedForwardLayer: nIn, nOut, timeDistributedFormat

Fields inherited from class BaseLayer: activationFn, biasInit, biasUpdater, gainInit, gradientNormalization, gradientNormalizationThreshold, iUpdater, regularization, regularizationBias, weightInitFn, weightNoise

Fields inherited from class Layer: constraints, iDropout, layerName
**Method Summary**

Modifier and Type | Method and Description |
---|---|
LayerMemoryReport | getMemoryReport(InputType inputType) - This is a report of the estimated memory consumption for the given layer |
InputType | getOutputType(int layerIndex, InputType inputType) - For a given type of input to this layer, what is the type of the output? |
InputPreProcessor | getPreProcessorForInputType(InputType inputType) - For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor |
ParamInitializer | initializer() |
Layer | instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType) |
void | setNIn(InputType inputType, boolean override) - Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type |
Methods inherited from class FeedForwardLayer: isPretrainParam

Methods inherited from class BaseLayer: clone, getGradientNormalization, getRegularizationByParam, getUpdaterByParam, resetLayerDefaultConfig

Methods inherited from class Layer: initializeConstraints, setDataType

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface TrainingConfig: getGradientNormalizationThreshold, getLayerName
**Field Detail**

protected ILossFunction lossFn
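The lossFn field is normally populated through the Builder rather than set directly. A minimal sketch, assuming the Builder exposes a lossFunction(ILossFunction) setter; the class name and the LossMCXENT choice are illustrative:

```java
import org.deeplearning4j.nn.conf.layers.RnnLossLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.ILossFunction;
import org.nd4j.linalg.lossfunctions.impl.LossMCXENT;

public class LossFnExample {
    public static void main(String[] args) {
        // Any ILossFunction implementation can be supplied; MCXENT is illustrative
        ILossFunction loss = new LossMCXENT();

        RnnLossLayer layer = new RnnLossLayer.Builder()
                .lossFunction(loss)              // ends up in the protected lossFn field
                .activation(Activation.SOFTMAX)
                .build();

        System.out.println(layer);
    }
}
```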
**Method Detail**

public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)

Specified by: instantiate in class Layer
public ParamInitializer initializer()

Specified by: initializer in class Layer
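The returned initializer is what reports this layer's (zero) parameter count. A minimal sketch, assuming ParamInitializer exposes a numParams(Layer) method in the version in use; the class name, loss function and activation are illustrative:

```java
import org.deeplearning4j.nn.conf.layers.RnnLossLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class InitializerExample {
    public static void main(String[] args) {
        RnnLossLayer layer = new RnnLossLayer.Builder(LossFunction.MCXENT)
                .activation(Activation.SOFTMAX)
                .build();

        // RnnLossLayer has no weights or biases, so its initializer should report zero parameters
        long numParams = layer.initializer().numParams(layer);
        System.out.println("Trainable parameters: " + numParams);
    }
}
```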
public InputType getOutputType(int layerIndex, InputType inputType)

Description copied from class: Layer
For a given type of input to this layer, what is the type of the output?

Overrides: getOutputType in class FeedForwardLayer

Parameters:
layerIndex - Index of the layer
inputType - Type of input for the layer
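A minimal sketch of querying the output type directly on the layer configuration (the class name and the 10-unit recurrent input are illustrative); with no time distributed dense component, the returned recurrent type is expected to match the input size:

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.RnnLossLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class OutputTypeExample {
    public static void main(String[] args) {
        RnnLossLayer layer = new RnnLossLayer.Builder(LossFunction.MCXENT)
                .activation(Activation.SOFTMAX)
                .build();

        // Recurrent input with 10 features per time step (illustrative size)
        InputType in = InputType.recurrent(10);

        // Output type for layer index 0, given this input type
        InputType out = layer.getOutputType(0, in);
        System.out.println(out); // expected: a recurrent type of the same size as the input
    }
}
```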
public InputPreProcessor getPreProcessorForInputType(InputType inputType)

Description copied from class: Layer
For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor

Overrides: getPreProcessorForInputType in class FeedForwardLayer

Parameters:
inputType - InputType to this layer
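A minimal sketch (class name and input size illustrative); for a recurrent input arriving at this RNN layer no conversion should be needed, so null is the expected result:

```java
import org.deeplearning4j.nn.conf.InputPreProcessor;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.RnnLossLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class PreProcessorExample {
    public static void main(String[] args) {
        RnnLossLayer layer = new RnnLossLayer.Builder(LossFunction.MCXENT)
                .activation(Activation.SOFTMAX)
                .build();

        // A recurrent (3d) input already has the layout this layer expects
        InputPreProcessor pp = layer.getPreProcessorForInputType(InputType.recurrent(10));
        System.out.println(pp); // expected: null (no preprocessor required)
    }
}
```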
public LayerMemoryReport getMemoryReport(InputType inputType)

Description copied from class: Layer
This is a report of the estimated memory consumption for the given layer

Specified by: getMemoryReport in class Layer

Parameters:
inputType - Input type to the layer. Memory consumption is often a function of the input type
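A minimal sketch of requesting a memory report (class name, 10 features and 100 time steps are illustrative); the report's contents depend on the input type supplied:

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.RnnLossLayer;
import org.deeplearning4j.nn.conf.memory.LayerMemoryReport;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class MemoryReportExample {
    public static void main(String[] args) {
        RnnLossLayer layer = new RnnLossLayer.Builder(LossFunction.MCXENT)
                .activation(Activation.SOFTMAX)
                .build();

        // Estimate memory use for a recurrent input: 10 features, 100 time steps (illustrative)
        LayerMemoryReport report = layer.getMemoryReport(InputType.recurrent(10, 100));
        System.out.println(report);
    }
}
```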
public void setNIn(InputType inputType, boolean override)

Description copied from class: Layer
Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type

Overrides: setNIn in class FeedForwardLayer

Parameters:
inputType - Input type for this layer
override - If false: only set the nIn value if it's not already set. If true: set it regardless of whether it's already set or not.
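A minimal sketch of the call pattern (class name and input size illustrative); the override flag controls whether an already-set nIn value is replaced, as described above:

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.RnnLossLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class SetNInExample {
    public static void main(String[] args) {
        RnnLossLayer layer = new RnnLossLayer.Builder(LossFunction.MCXENT)
                .activation(Activation.SOFTMAX)
                .build();

        // Recurrent input with 10 features per time step (illustrative size);
        // override = true replaces any previously set nIn value,
        // override = false would leave an already-set value untouched.
        layer.setNIn(InputType.recurrent(10), true);
        System.out.println("nIn after setNIn: " + layer.getNIn());
    }
}
```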