Package org.deeplearning4j.nn.conf.misc

Class DummyConfig

java.lang.Object
    org.deeplearning4j.nn.conf.misc.DummyConfig

All Implemented Interfaces:
    TrainingConfig

public class DummyConfig extends Object implements TrainingConfig

Constructor Summary

Constructors:
    DummyConfig()
-
Method Summary

All Methods | Instance Methods | Concrete Methods

Modifier and Type        Method and Description
GradientNormalization    getGradientNormalization()
double                   getGradientNormalizationThreshold()
String                   getLayerName()
List<Regularization>     getRegularizationByParam(String paramName)
                         Get the regularization types (l1/l2/weight decay) for the given parameter.
IUpdater                 getUpdaterByParam(String paramName)
                         Get the updater for the given parameter.
boolean                  isPretrainParam(String paramName)
                         Is the specified parameter a layerwise pretraining only parameter?
void                     setDataType(DataType dataType)
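As a rough sketch of how a TrainingConfig such as DummyConfig is consumed, the following self-contained example uses simplified stand-in types (SimpleTrainingConfig, SimpleDummyConfig are hypothetical names, not the real org.deeplearning4j interfaces):

```java
// Self-contained sketch of the TrainingConfig query pattern. These are
// simplified stand-in types, NOT the real org.deeplearning4j interfaces.
interface SimpleTrainingConfig {
    String getLayerName();
    boolean isPretrainParam(String paramName);
}

// Like DummyConfig, this implementation gives fixed answers for every parameter.
class SimpleDummyConfig implements SimpleTrainingConfig {
    public String getLayerName() { return "DummyConfig"; }
    public boolean isPretrainParam(String paramName) { return false; }
}

public class TrainingConfigSketch {
    public static void main(String[] args) {
        SimpleTrainingConfig conf = new SimpleDummyConfig();
        // Supervised backprop would skip any pretrain-only parameters:
        for (String p : new String[]{"W", "b"}) {
            if (!conf.isPretrainParam(p)) {
                System.out.println("updating parameter: " + p);
            }
        }
    }
}
```

Because isPretrainParam returns false for every parameter here, both "W" and "b" are updated.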
Method Detail
getLayerName

public String getLayerName()

Specified by:
    getLayerName in interface TrainingConfig
Returns:
    Name of the layer
getRegularizationByParam

public List<Regularization> getRegularizationByParam(String paramName)

Description copied from interface: TrainingConfig
Get the regularization types (l1/l2/weight decay) for the given parameter. Different parameters may have different regularization types.

Specified by:
    getRegularizationByParam in interface TrainingConfig
Parameters:
    paramName - Parameter name ("W", "b" etc)
Returns:
    Regularization types (if any) for the specified parameter
isPretrainParam

public boolean isPretrainParam(String paramName)

Description copied from interface: TrainingConfig
Is the specified parameter a layerwise pretraining only parameter?
For example, visible bias params in an autoencoder (or, decoder params in a variational autoencoder) aren't used during supervised backprop.
Layers (like DenseLayer, etc) with no pretrainable parameters will return false for all (valid) inputs.

Specified by:
    isPretrainParam in interface TrainingConfig
Parameters:
    paramName - Parameter name/key
Returns:
    True if the parameter is for layerwise pretraining only, false otherwise
getUpdaterByParam

public IUpdater getUpdaterByParam(String paramName)

Description copied from interface: TrainingConfig
Get the updater for the given parameter. Typically the same updater will be used for all parameters, but this is not necessarily the case.

Specified by:
    getUpdaterByParam in interface TrainingConfig
Parameters:
    paramName - Parameter name
Returns:
    IUpdater for the parameter
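The per-parameter updater contract can be illustrated with a plain map-backed lookup. This is a hypothetical sketch: plain strings stand in for DL4J's IUpdater implementations, and PerParamUpdaters is not a real DL4J class.

```java
import java.util.Map;

// Illustration of the per-parameter updater contract. Plain strings stand in
// for DL4J's IUpdater implementations; this is not the real DummyConfig logic.
class PerParamUpdaters {
    private static final Map<String, String> UPDATER_BY_PARAM = Map.of(
            "W", "Adam",   // weights might use one updater...
            "b", "Sgd");   // ...while biases use another

    static String getUpdaterByParam(String paramName) {
        return UPDATER_BY_PARAM.get(paramName);
    }
}

public class UpdaterLookupDemo {
    public static void main(String[] args) {
        System.out.println(PerParamUpdaters.getUpdaterByParam("W")); // prints "Adam"
        System.out.println(PerParamUpdaters.getUpdaterByParam("b")); // prints "Sgd"
    }
}
```

The point of the map is exactly the caveat in the description above: each parameter name can resolve to a different updater.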
getGradientNormalization

public GradientNormalization getGradientNormalization()

Specified by:
    getGradientNormalization in interface TrainingConfig
Returns:
    The gradient normalization configuration
getGradientNormalizationThreshold

public double getGradientNormalizationThreshold()

Specified by:
    getGradientNormalizationThreshold in interface TrainingConfig
Returns:
    The gradient normalization threshold
setDataType

public void setDataType(DataType dataType)

Specified by:
    setDataType in interface TrainingConfig