public class DummyConfig extends Object implements TrainingConfig
| Constructor and Description |
|---|
| DummyConfig() |
| Modifier and Type | Method and Description |
|---|---|
| GradientNormalization | getGradientNormalization() |
| double | getGradientNormalizationThreshold() |
| String | getLayerName() |
| List<Regularization> | getRegularizationByParam(String paramName): Get the regularization types (l1/l2/weight decay) for the given parameter. |
| IUpdater | getUpdaterByParam(String paramName): Get the updater for the given parameter. |
| boolean | isPretrainParam(String paramName): Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs. |
| void | setDataType(DataType dataType) |
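Since DummyConfig implements TrainingConfig, an instance can stand in wherever per-parameter training settings are queried. Below is a minimal usage sketch, assuming the DL4J/ND4J import paths shown and the no-arg constructor from the summary above; the concrete values a DummyConfig returns are implementation-defined:

```java
import java.util.List;

// Import paths are assumptions and may differ between DL4J/ND4J versions.
import org.deeplearning4j.nn.api.TrainingConfig;
import org.deeplearning4j.nn.conf.misc.DummyConfig;
import org.nd4j.linalg.learning.config.IUpdater;
import org.nd4j.linalg.learning.regularization.Regularization;

public class DummyConfigDemo {
    public static void main(String[] args) {
        // DummyConfig is a TrainingConfig, so it can be used through the interface.
        TrainingConfig conf = new DummyConfig();

        // Per-parameter queries take the parameter name/key ("W", "b", etc.).
        IUpdater updater = conf.getUpdaterByParam("W");
        List<Regularization> reg = conf.getRegularizationByParam("W");
        boolean pretrainOnly = conf.isPretrainParam("W");

        System.out.println("Layer name:      " + conf.getLayerName());
        System.out.println("Updater for W:   " + updater);
        System.out.println("Regularization:  " + reg);
        System.out.println("Pretrain-only W: " + pretrainOnly);
    }
}
```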
public String getLayerName()

Specified by: getLayerName in interface TrainingConfig

public List<Regularization> getRegularizationByParam(String paramName)

Get the regularization types (l1/l2/weight decay) for the given parameter.

Specified by: getRegularizationByParam in interface TrainingConfig
Parameters: paramName - Parameter name ("W", "b" etc)

public boolean isPretrainParam(String paramName)

Specified by: isPretrainParam in interface TrainingConfig
Parameters: paramName - Parameter name/key

public IUpdater getUpdaterByParam(String paramName)

Get the updater for the given parameter.

Specified by: getUpdaterByParam in interface TrainingConfig
Parameters: paramName - Parameter name

public GradientNormalization getGradientNormalization()

Specified by: getGradientNormalization in interface TrainingConfig

public double getGradientNormalizationThreshold()

Specified by: getGradientNormalizationThreshold in interface TrainingConfig

public void setDataType(DataType dataType)

Specified by: setDataType in interface TrainingConfig
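All of the methods above are specified by TrainingConfig, so a custom configuration only needs to answer the same per-parameter queries. Below is a minimal sketch of such an implementation with fixed answers; the import paths, the Adam updater, and the GradientNormalization.None constant are assumptions about the surrounding DL4J/ND4J APIs, not part of DummyConfig itself:

```java
import java.util.Collections;
import java.util.List;

// Import paths are assumptions and may differ between DL4J/ND4J versions.
import org.deeplearning4j.nn.api.TrainingConfig;
import org.deeplearning4j.nn.conf.GradientNormalization;
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.learning.config.IUpdater;
import org.nd4j.linalg.learning.regularization.Regularization;

public class FixedTrainingConfig implements TrainingConfig {

    @Override
    public String getLayerName() {
        return "fixed";
    }

    @Override
    public List<Regularization> getRegularizationByParam(String paramName) {
        // No l1/l2/weight decay for any parameter.
        return Collections.emptyList();
    }

    @Override
    public boolean isPretrainParam(String paramName) {
        // No layerwise-pretraining-only parameters, so false for all inputs
        // (mirroring the DenseLayer behavior described above).
        return false;
    }

    @Override
    public IUpdater getUpdaterByParam(String paramName) {
        // Same updater for every parameter.
        return new Adam(1e-3);
    }

    @Override
    public GradientNormalization getGradientNormalization() {
        return GradientNormalization.None;
    }

    @Override
    public double getGradientNormalizationThreshold() {
        return 1.0;
    }

    @Override
    public void setDataType(DataType dataType) {
        // No-op: this configuration holds no dtype-dependent state.
    }
}
```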