Class FrozenLayerWithBackprop
- java.lang.Object
-
- org.deeplearning4j.nn.conf.layers.Layer
-
- org.deeplearning4j.nn.conf.layers.wrapper.BaseWrapperLayer
-
- org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop
-
- All Implemented Interfaces:
Serializable, Cloneable, TrainingConfig
public class FrozenLayerWithBackprop extends BaseWrapperLayer
A frozen layer freezes the parameters of the layer it wraps, but allows backpropagation to continue through it.
- Author:
- Ugljesa Jovanovic ([email protected]) on 06/05/2018.
- See Also:
FrozenLayer, Serialized Form
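The sketch below shows how this wrapper might be used inside a network configuration: the wrapped layer's parameters stay fixed during training, while gradients still flow through it to earlier (trainable) layers. This is illustrative only; the layer sizes, activations, and loss function are assumptions, not part of this class's contract.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .list()
        // Trainable first layer: still receives gradients backpropagated
        // through the frozen layer below it
        .layer(new DenseLayer.Builder().nIn(784).nOut(256).build())
        // Frozen wrapper: the wrapped DenseLayer's parameters are not updated,
        // but backpropagation continues through it
        .layer(new FrozenLayerWithBackprop(
                new DenseLayer.Builder().nIn(256).nOut(128).build()))
        .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                .activation(Activation.SOFTMAX).nIn(128).nOut(10).build())
        .build();
```

Contrast with FrozenLayer (see above), which freezes parameters without this backprop-passthrough behavior.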
-
-
Nested Class Summary
-
Nested classes/interfaces inherited from class org.deeplearning4j.nn.conf.layers.Layer
Layer.Builder<T extends Layer.Builder<T>>
-
-
Field Summary
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.wrapper.BaseWrapperLayer
underlying
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.Layer
constraints, iDropout, layerName
-
-
Constructor Summary
Constructors
FrozenLayerWithBackprop(Layer layer)
-
Method Summary
All Methods Instance Methods Concrete Methods
Layer clone()
NeuralNetConfiguration getInnerConf(NeuralNetConfiguration conf)
List<Regularization> getRegularizationByParam(String paramName)
Get the regularization types (l1/l2/weight decay) for the given parameter.
IUpdater getUpdaterByParam(String paramName)
Get the updater for the given parameter.
ParamInitializer initializer()
Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)
boolean isPretrainParam(String paramName)
Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.
void setConstraints(List<LayerConstraint> constraints)
void setLayerName(String layerName)
Methods inherited from class org.deeplearning4j.nn.conf.layers.wrapper.BaseWrapperLayer
getGradientNormalization, getGradientNormalizationThreshold, getMemoryReport, getOutputType, getPreProcessorForInputType, setNIn
-
Methods inherited from class org.deeplearning4j.nn.conf.layers.Layer
initializeConstraints, resetLayerDefaultConfig, setDataType
-
Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface org.deeplearning4j.nn.api.TrainingConfig
getLayerName
-
-
-
-
Constructor Detail
-
FrozenLayerWithBackprop
public FrozenLayerWithBackprop(Layer layer)
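A minimal construction sketch: the constructor takes any layer configuration to freeze. The wrapped layer here is illustrative.

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop;

// Build the layer configuration whose parameters should be frozen
DenseLayer toFreeze = new DenseLayer.Builder().nIn(100).nOut(50).build();

// Wrap it; its parameters will not be updated during training,
// but backpropagation still passes through it
FrozenLayerWithBackprop frozen = new FrozenLayerWithBackprop(toFreeze);
```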
-
-
Method Detail
-
getInnerConf
public NeuralNetConfiguration getInnerConf(NeuralNetConfiguration conf)
-
instantiate
public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)
- Specified by:
instantiate in class Layer
-
initializer
public ParamInitializer initializer()
- Overrides:
initializer in class BaseWrapperLayer
- Returns:
- The parameter initializer for this model
-
getRegularizationByParam
public List<Regularization> getRegularizationByParam(String paramName)
Description copied from class: Layer
Get the regularization types (l1/l2/weight decay) for the given parameter. Different parameters may have different regularization types.
- Specified by:
getRegularizationByParam in interface TrainingConfig
- Overrides:
getRegularizationByParam in class BaseWrapperLayer
- Parameters:
paramName - Parameter name ("W", "b" etc)
- Returns:
- Regularization types (if any) for the specified parameter
-
isPretrainParam
public boolean isPretrainParam(String paramName)
Description copied from class: Layer
Is the specified parameter a layerwise pretraining only parameter?
For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop.
Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.
- Specified by:
isPretrainParam in interface TrainingConfig
- Overrides:
isPretrainParam in class BaseWrapperLayer
- Parameters:
paramName - Parameter name/key
- Returns:
- True if the parameter is for layerwise pretraining only, false otherwise
-
getUpdaterByParam
public IUpdater getUpdaterByParam(String paramName)
Description copied from class: Layer
Get the updater for the given parameter. Typically the same updater will be used for all parameters, but this is not necessarily the case.
- Specified by:
getUpdaterByParam in interface TrainingConfig
- Overrides:
getUpdaterByParam in class Layer
- Parameters:
paramName - Parameter name
- Returns:
- IUpdater for the parameter
-
setLayerName
public void setLayerName(String layerName)
- Overrides:
setLayerName in class BaseWrapperLayer
-
setConstraints
public void setConstraints(List<LayerConstraint> constraints)
-
-