Class Layer
- java.lang.Object
  - org.deeplearning4j.nn.conf.layers.Layer
- All Implemented Interfaces:
  Serializable, Cloneable, TrainingConfig
- Direct Known Subclasses:
  AbstractSameDiffLayer, BaseLayer, BaseWrapperLayer, Bidirectional, FrozenLayer, LocalResponseNormalization, NoParamLayer, Yolo2OutputLayer
public abstract class Layer extends Object implements TrainingConfig, Serializable, Cloneable

A neural network layer.
- See Also:
  - Serialized Form
-
-
Nested Class Summary
Nested Classes:
- static class Layer.Builder<T extends Layer.Builder<T>>
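The recursive generic bound `Builder<T extends Layer.Builder<T>>` is the self-typed ("curiously recurring") builder pattern: setters inherited from the base builder return the subclass's own type, so method chaining can mix base and subclass setters. Below is a minimal, self-contained sketch of that pattern; the names (`BaseBuilder`, `DenseBuilder`, `name`, `units`) are illustrative stand-ins, not DL4J's actual fields or classes.

```java
// Sketch of the self-typed builder pattern behind Layer.Builder<T extends Layer.Builder<T>>.
// BaseBuilder/DenseBuilder are hypothetical; only the generics technique matches the docs.
abstract class BaseBuilder<T extends BaseBuilder<T>> {
    String name;

    @SuppressWarnings("unchecked")
    public T name(String name) {   // returns the subclass type T, not BaseBuilder,
        this.name = name;          // so chaining continues with subclass-specific setters
        return (T) this;
    }
}

class DenseBuilder extends BaseBuilder<DenseBuilder> {
    int units;

    public DenseBuilder units(int units) {
        this.units = units;
        return this;
    }
}

public class BuilderDemo {
    public static String describe() {
        // name(...) yields DenseBuilder, so units(...) can follow in the same chain
        DenseBuilder b = new DenseBuilder().name("dense0").units(128);
        return b.name + ":" + b.units;
    }

    public static void main(String[] args) {
        System.out.println(describe()); // dense0:128
    }
}
```

Without the recursive bound, `name(...)` would return the base type and the chain would lose access to `units(...)`.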
-
Field Summary
Fields:
- protected List<LayerConstraint> constraints
- protected IDropout iDropout
- protected String layerName
-
Constructor Summary
Constructors:
- Layer(Layer.Builder builder)
-
Method Summary
- Layer clone()
- abstract LayerMemoryReport getMemoryReport(InputType inputType)
  A report of the estimated memory consumption for the given layer.
- abstract InputType getOutputType(int layerIndex, InputType inputType)
  For a given type of input to this layer, what is the type of the output?
- abstract InputPreProcessor getPreProcessorForInputType(InputType inputType)
  For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor.
- abstract List<Regularization> getRegularizationByParam(String paramName)
  Get the regularization types (l1/l2/weight decay) for the given parameter.
- IUpdater getUpdaterByParam(String paramName)
  Get the updater for the given parameter.
- protected void initializeConstraints(Layer.Builder<?> builder)
  Initialize the weight constraints.
- abstract ParamInitializer initializer()
- abstract Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)
- abstract boolean isPretrainParam(String paramName)
  Is the specified parameter a layerwise pretraining-only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.
- void resetLayerDefaultConfig()
  Reset the learning-related configs of the layer to default.
- void setDataType(DataType dataType)
- abstract void setNIn(InputType inputType, boolean override)
  Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface org.deeplearning4j.nn.api.TrainingConfig
getGradientNormalization, getGradientNormalizationThreshold, getLayerName
-
Field Detail
-
layerName
protected String layerName
-
iDropout
protected IDropout iDropout
-
constraints
protected List<LayerConstraint> constraints
-
-
Constructor Detail
-
Layer
public Layer(Layer.Builder builder)
-
-
Method Detail
-
initializeConstraints
protected void initializeConstraints(Layer.Builder<?> builder)
Initialize the weight constraints. Should be called last, in the outermost constructor.
-
resetLayerDefaultConfig
public void resetLayerDefaultConfig()
Reset the learning-related configs of the layer to default. When instantiated with a global neural network configuration, the parameters specified in the neural network configuration will be used. For internal use with the transfer learning API; users should not need to call this method directly.
-
instantiate
public abstract Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)
-
initializer
public abstract ParamInitializer initializer()
- Returns:
- The parameter initializer for this model
-
getOutputType
public abstract InputType getOutputType(int layerIndex, InputType inputType)
For a given type of input to this layer, what is the type of the output?
- Parameters:
  layerIndex - Index of the layer
  inputType - Type of input for the layer
- Returns:
  - Type of output from the layer
- Throws:
  IllegalStateException - if input type is invalid for this layer
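As a hedged illustration of this contract (not DL4J's actual implementation), a dense-style layer maps any feed-forward input to a feed-forward output of size nOut, and signals invalid input with IllegalStateException. `FFInput` below is a hypothetical stand-in for `InputType.feedForward(...)`:

```java
// Illustrative sketch of the getOutputType contract; FFInput and N_OUT are
// hypothetical stand-ins for DL4J's InputType and layer configuration.
public class OutputTypeDemo {
    static class FFInput {                 // stand-in for InputType.feedForward(size)
        final long size;
        FFInput(long size) { this.size = size; }
    }

    static final long N_OUT = 10;

    // For a feed-forward input, output is feed-forward of size nOut; a real layer
    // throws IllegalStateException for invalid input types (modeled by null here).
    public static FFInput getOutputType(int layerIndex, FFInput inputType) {
        if (inputType == null) {
            throw new IllegalStateException("Invalid input type for layer " + layerIndex);
        }
        return new FFInput(N_OUT);
    }

    public static void main(String[] args) {
        System.out.println(getOutputType(0, new FFInput(784)).size); // 10
    }
}
```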
-
setNIn
public abstract void setNIn(InputType inputType, boolean override)
Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
- Parameters:
  inputType - Input type for this layer
  override - If false: only set the nIn value if it's not already set. If true: set it regardless of whether it's already set or not.
- Throws:
  IllegalStateException - if input type is invalid for this layer
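The override semantics can be sketched as follows; this is an assumption-laden illustration, not DL4J's code, and `FFInput` plus the "0 means unset" convention are hypothetical:

```java
// Illustrative sketch of setNIn's override semantics; the fields and FFInput
// type are hypothetical stand-ins, not DL4J's real classes.
public class SetNInDemo {
    static class FFInput {
        final long size;
        FFInput(long size) { this.size = size; }
    }

    long nIn = 0;   // 0 means "not yet set" in this sketch

    // override=false: set nIn only if it has no value yet;
    // override=true: always replace it with the input type's size.
    public void setNIn(FFInput inputType, boolean override) {
        if (override || nIn == 0) {
            nIn = inputType.size;
        }
    }

    public static void main(String[] args) {
        SetNInDemo layer = new SetNInDemo();
        layer.setNIn(new FFInput(784), false);  // unset -> becomes 784
        layer.setNIn(new FFInput(100), false);  // already set, override=false -> stays 784
        System.out.println(layer.nIn);          // 784
        layer.setNIn(new FFInput(100), true);   // override=true -> becomes 100
        System.out.println(layer.nIn);          // 100
    }
}
```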
-
getPreProcessorForInputType
public abstract InputPreProcessor getPreProcessorForInputType(InputType inputType)
For the given type of input to this layer, what preprocessor (if any) is required?
Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor.
- Parameters:
  inputType - InputType to this layer
- Returns:
  - Null if no preprocessor is required, otherwise the type of preprocessor necessary for this layer/input combination
- Throws:
  IllegalStateException - if input type is invalid for this layer
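The null-vs-preprocessor contract can be sketched like this for a feed-forward-style layer; the `Input` enum and string return value are hypothetical stand-ins for DL4J's `InputType` and `InputPreProcessor` hierarchies:

```java
// Illustrative sketch of getPreProcessorForInputType's contract; not DL4J's real API.
public class PreProcDemo {
    enum Input { FEED_FORWARD, CNN }   // stand-in for InputType variants

    // A feed-forward layer needs a CNN->FF preprocessor for convolutional input,
    // and no preprocessor (null) when the input is already feed-forward.
    public static String getPreProcessorForInputType(Input inputType) {
        switch (inputType) {
            case FEED_FORWARD: return null;  // already the right shape: no preprocessing
            case CNN:          return "CnnToFeedForwardPreProcessor";
            default: throw new IllegalStateException("Invalid input type: " + inputType);
        }
    }

    public static void main(String[] args) {
        System.out.println(getPreProcessorForInputType(Input.CNN));
    }
}
```

Callers are expected to treat a null return as "no preprocessing needed", not as an error.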
-
getRegularizationByParam
public abstract List<Regularization> getRegularizationByParam(String paramName)
Get the regularization types (l1/l2/weight decay) for the given parameter. Different parameters may have different regularization types.
- Specified by:
  getRegularizationByParam in interface TrainingConfig
- Parameters:
  paramName - Parameter name ("W", "b", etc.)
- Returns:
  - Regularization types (if any) for the specified parameter
-
isPretrainParam
public abstract boolean isPretrainParam(String paramName)
Is the specified parameter a layerwise pretraining only parameter?
For example, visible bias params in an autoencoder (or, decoder params in a variational autoencoder) aren't used during supervised backprop.
Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.
- Specified by:
  isPretrainParam in interface TrainingConfig
- Parameters:
  paramName - Parameter name/key
- Returns:
  - True if the parameter is for layerwise pretraining only, false otherwise
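A minimal sketch of what an implementation of this predicate could look like for an autoencoder-style layer, assuming the common convention that "W" and "b" are the weights/bias used in backprop and "vb" is the visible bias used only during pretraining (the key names here are assumptions, not taken from DL4J's sources):

```java
import java.util.Set;

// Illustrative sketch of isPretrainParam for an autoencoder-style layer;
// the parameter keys ("W", "b", "vb") are assumed conventions, not DL4J's real code.
public class PretrainParamDemo {
    // Visible bias is used only during layerwise pretraining;
    // "W" and "b" are also used in supervised backprop.
    static final Set<String> PRETRAIN_ONLY = Set.of("vb");

    public static boolean isPretrainParam(String paramName) {
        return PRETRAIN_ONLY.contains(paramName);
    }

    public static void main(String[] args) {
        System.out.println(isPretrainParam("vb")); // true
        System.out.println(isPretrainParam("W"));  // false
    }
}
```

A layer with no pretrainable parameters would simply return false unconditionally, matching the DenseLayer behavior described above.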
-
getUpdaterByParam
public IUpdater getUpdaterByParam(String paramName)
Get the updater for the given parameter. Typically the same updater will be used for all parameters, but this is not necessarily the case.
- Specified by:
  getUpdaterByParam in interface TrainingConfig
- Parameters:
  paramName - Parameter name
- Returns:
  - IUpdater for the parameter
-
setDataType
public void setDataType(DataType dataType)
- Specified by:
setDataType in interface TrainingConfig
-
getMemoryReport
public abstract LayerMemoryReport getMemoryReport(InputType inputType)
This is a report of the estimated memory consumption for the given layer.
- Parameters:
  inputType - Input type to the layer. Memory consumption is often a function of the input type.
- Returns:
  - Memory report for the layer