Class FrozenLayerWithBackprop

java.lang.Object
  org.deeplearning4j.nn.conf.layers.Layer
    org.deeplearning4j.nn.conf.layers.wrapper.BaseWrapperLayer
      org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop

All Implemented Interfaces:
Serializable, Cloneable, TrainingConfig

public class FrozenLayerWithBackprop extends BaseWrapperLayer

Frozen layer that freezes the parameters of the layer it wraps, but allows backpropagation to continue through it.

Author:
Ugljesa Jovanovic ([email protected]) on 06/05/2018.

See Also:
FrozenLayer, Serialized Form
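
A minimal configuration sketch of typical usage: wrapping a layer in FrozenLayerWithBackprop keeps its parameters fixed while gradients still flow through it to earlier, trainable layers. The layer sizes, activation, and loss function below are illustrative assumptions, not part of this API page.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class FrozenLayerWithBackpropExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                // Trainable layer: still receives gradients backpropagated
                // through the frozen layer above it
                .layer(new DenseLayer.Builder().nIn(10).nOut(10).build())
                // Frozen layer: its parameters are never updated, but the
                // backward pass continues through it (unlike stopping gradients)
                .layer(new FrozenLayerWithBackprop(
                        new DenseLayer.Builder().nIn(10).nOut(10).build()))
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX).nIn(10).nOut(3).build())
                .build();
    }
}
```

This is what distinguishes it from FrozenLayer: both skip parameter updates for the wrapped layer, but FrozenLayerWithBackprop still propagates gradients to the layers below it.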
Nested Class Summary

Nested classes/interfaces inherited from class org.deeplearning4j.nn.conf.layers.Layer:
Layer.Builder<T extends Layer.Builder<T>>

Field Summary

Fields inherited from class org.deeplearning4j.nn.conf.layers.wrapper.BaseWrapperLayer:
underlying

Fields inherited from class org.deeplearning4j.nn.conf.layers.Layer:
constraints, iDropout, layerName

Constructor Summary

FrozenLayerWithBackprop(Layer layer)
Method Summary

Layer clone()

NeuralNetConfiguration getInnerConf(NeuralNetConfiguration conf)

List<Regularization> getRegularizationByParam(String paramName)
Get the regularization types (l1/l2/weight decay) for the given parameter.

IUpdater getUpdaterByParam(String paramName)
Get the updater for the given parameter.

ParamInitializer initializer()

Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)

boolean isPretrainParam(String paramName)
Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.

void setConstraints(List<LayerConstraint> constraints)

void setLayerName(String layerName)
Methods inherited from class org.deeplearning4j.nn.conf.layers.wrapper.BaseWrapperLayer:
getGradientNormalization, getGradientNormalizationThreshold, getMemoryReport, getOutputType, getPreProcessorForInputType, setNIn

Methods inherited from class org.deeplearning4j.nn.conf.layers.Layer:
initializeConstraints, resetLayerDefaultConfig, setDataType

Methods inherited from class java.lang.Object:
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface org.deeplearning4j.nn.api.TrainingConfig:
getLayerName
Constructor Detail

FrozenLayerWithBackprop

public FrozenLayerWithBackprop(Layer layer)
Method Detail

getInnerConf

public NeuralNetConfiguration getInnerConf(NeuralNetConfiguration conf)

instantiate

public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)

Specified by:
instantiate in class Layer

initializer

public ParamInitializer initializer()

Overrides:
initializer in class BaseWrapperLayer
Returns:
The parameter initializer for this model
getRegularizationByParam

public List<Regularization> getRegularizationByParam(String paramName)

Description copied from class: Layer
Get the regularization types (l1/l2/weight decay) for the given parameter. Different parameters may have different regularization types.

Specified by:
getRegularizationByParam in interface TrainingConfig
Overrides:
getRegularizationByParam in class BaseWrapperLayer
Parameters:
paramName - Parameter name ("W", "b" etc)
Returns:
Regularization types (if any) for the specified parameter
isPretrainParam

public boolean isPretrainParam(String paramName)

Description copied from class: Layer
Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.

Specified by:
isPretrainParam in interface TrainingConfig
Overrides:
isPretrainParam in class BaseWrapperLayer
Parameters:
paramName - Parameter name/key
Returns:
True if the parameter is for layerwise pretraining only, false otherwise
getUpdaterByParam

public IUpdater getUpdaterByParam(String paramName)

Description copied from class: Layer
Get the updater for the given parameter. Typically the same updater will be used for all parameters, but this is not necessarily the case.

Specified by:
getUpdaterByParam in interface TrainingConfig
Overrides:
getUpdaterByParam in class Layer
Parameters:
paramName - Parameter name
Returns:
IUpdater for the parameter
setLayerName

public void setLayerName(String layerName)

Overrides:
setLayerName in class BaseWrapperLayer

setConstraints

public void setConstraints(List<LayerConstraint> constraints)