Package org.deeplearning4j.nn.conf.misc
Class DummyConfig
java.lang.Object
  org.deeplearning4j.nn.conf.misc.DummyConfig

All Implemented Interfaces:
  TrainingConfig

public class DummyConfig extends Object implements TrainingConfig

Constructor Summary

Constructor and Description
  DummyConfig()

Method Summary

Modifier and Type      Method and Description
GradientNormalization  getGradientNormalization()
double                 getGradientNormalizationThreshold()
String                 getLayerName()
List<Regularization>   getRegularizationByParam(String paramName)
                       Get the regularization types (l1/l2/weight decay) for the given parameter.
IUpdater               getUpdaterByParam(String paramName)
                       Get the updater for the given parameter.
boolean                isPretrainParam(String paramName)
                       Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc) with no pretrainable parameters will return false for all (valid) inputs.
void                   setDataType(DataType dataType)

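Example (illustrative). Before the method details, the sketch below constructs a DummyConfig via the no-argument constructor listed in the Constructor Summary and reads it through the TrainingConfig interface. The import path for TrainingConfig is an assumption based on typical Deeplearning4j package layout, not something stated on this page.

  import org.deeplearning4j.nn.api.TrainingConfig;   // assumed package
  import org.deeplearning4j.nn.conf.misc.DummyConfig;

  public class DummyConfigOverview {
      public static void main(String[] args) {
          // DummyConfig is a placeholder TrainingConfig implementation;
          // the no-arg constructor comes from the Constructor Summary above.
          TrainingConfig conf = new DummyConfig();

          // getLayerName() (detailed below) returns the layer's name.
          System.out.println("Layer name: " + conf.getLayerName());
      }
  }
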
Method Detail
getLayerName

public String getLayerName()

Specified by:
  getLayerName in interface TrainingConfig
Returns:
  Name of the layer

getRegularizationByParam

public List<Regularization> getRegularizationByParam(String paramName)

Description copied from interface: TrainingConfig
Get the regularization types (l1/l2/weight decay) for the given parameter. Different parameters may have different regularization types.

Specified by:
  getRegularizationByParam in interface TrainingConfig
Parameters:
  paramName - Parameter name ("W", "b" etc)
Returns:
  Regularization types (if any) for the specified parameter

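Example (illustrative). A minimal sketch of a per-parameter regularization lookup, assuming the no-arg constructor above and the usual nd4j package for Regularization; "W" is the conventional weight-parameter key from the Parameters note.

  import java.util.List;
  import org.deeplearning4j.nn.api.TrainingConfig;               // assumed package
  import org.deeplearning4j.nn.conf.misc.DummyConfig;
  import org.nd4j.linalg.learning.regularization.Regularization; // assumed package

  public class RegularizationLookup {
      public static void main(String[] args) {
          TrainingConfig conf = new DummyConfig();

          // "W" is the conventional key for a layer's weight parameter.
          List<Regularization> weightReg = conf.getRegularizationByParam("W");

          // An empty list means no l1/l2/weight decay applies to "W".
          System.out.println("Regularization on W: " + weightReg);
      }
  }
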
isPretrainParam

public boolean isPretrainParam(String paramName)

Description copied from interface: TrainingConfig
Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc) with no pretrainable parameters will return false for all (valid) inputs.

Specified by:
  isPretrainParam in interface TrainingConfig
Parameters:
  paramName - Parameter name/key
Returns:
  True if the parameter is for layerwise pretraining only, false otherwise

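Example (illustrative). A sketch of checking whether parameters are pretrain-only, under the same constructor and import-path assumptions noted earlier.

  import org.deeplearning4j.nn.api.TrainingConfig; // assumed package
  import org.deeplearning4j.nn.conf.misc.DummyConfig;

  public class PretrainParamCheck {
      public static void main(String[] args) {
          TrainingConfig conf = new DummyConfig();

          // Ordinary supervised parameters such as "W" and "b" should report
          // false; only layerwise-pretraining-only parameters (e.g. an
          // autoencoder's visible bias) report true.
          for (String param : new String[] {"W", "b"}) {
              System.out.println(param + " pretrain-only: " + conf.isPretrainParam(param));
          }
      }
  }
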
getUpdaterByParam

public IUpdater getUpdaterByParam(String paramName)

Description copied from interface: TrainingConfig
Get the updater for the given parameter. Typically the same updater will be used for all parameters, but this is not necessarily the case.

Specified by:
  getUpdaterByParam in interface TrainingConfig
Parameters:
  paramName - Parameter name
Returns:
  IUpdater for the parameter

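Example (illustrative). A sketch of a per-parameter updater lookup, assuming the usual nd4j package for IUpdater. As the description notes, "W" and "b" typically share one updater, but the lookup is still keyed by parameter name.

  import org.deeplearning4j.nn.api.TrainingConfig; // assumed package
  import org.deeplearning4j.nn.conf.misc.DummyConfig;
  import org.nd4j.linalg.learning.config.IUpdater; // assumed package

  public class UpdaterLookup {
      public static void main(String[] args) {
          TrainingConfig conf = new DummyConfig();

          IUpdater weightUpdater = conf.getUpdaterByParam("W");
          IUpdater biasUpdater   = conf.getUpdaterByParam("b");

          // Typically the same instance backs every parameter, but the
          // interface does not guarantee it.
          System.out.println("Same updater for W and b: " + (weightUpdater == biasUpdater));
      }
  }
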
getGradientNormalization

public GradientNormalization getGradientNormalization()

Specified by:
  getGradientNormalization in interface TrainingConfig
Returns:
  The gradient normalization configuration

getGradientNormalizationThreshold

public double getGradientNormalizationThreshold()

Specified by:
  getGradientNormalizationThreshold in interface TrainingConfig
Returns:
  The gradient normalization threshold

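Example (illustrative). The two gradient-normalization getters are naturally read together, since a threshold is only meaningful for threshold-based normalization strategies. The GradientNormalization import path is an assumption.

  import org.deeplearning4j.nn.api.TrainingConfig;         // assumed package
  import org.deeplearning4j.nn.conf.GradientNormalization; // assumed package
  import org.deeplearning4j.nn.conf.misc.DummyConfig;

  public class GradientNormalizationInfo {
      public static void main(String[] args) {
          TrainingConfig conf = new DummyConfig();

          GradientNormalization norm = conf.getGradientNormalization();
          double threshold = conf.getGradientNormalizationThreshold();
          System.out.println("Strategy: " + norm + ", threshold: " + threshold);
      }
  }
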
setDataType

public void setDataType(DataType dataType)

Specified by:
  setDataType in interface TrainingConfig
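Example (illustrative). A sketch of setting the configuration's data type, assuming the usual nd4j package for DataType; FLOAT is a common choice in Deeplearning4j but is an assumption here, not something this page specifies.

  import org.deeplearning4j.nn.api.TrainingConfig; // assumed package
  import org.deeplearning4j.nn.conf.misc.DummyConfig;
  import org.nd4j.linalg.api.buffer.DataType;      // assumed package

  public class DataTypeExample {
      public static void main(String[] args) {
          TrainingConfig conf = new DummyConfig();

          // Switch the configuration's numerical precision to single-precision
          // floats (FLOAT is assumed to exist on nd4j's DataType enum).
          conf.setDataType(DataType.FLOAT);
      }
  }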