public interface TrainingConfig
Modifier and Type | Method and Description
---|---
GradientNormalization | getGradientNormalization()
double | getGradientNormalizationThreshold()
double | getL1ByParam(String paramName): Get the L1 coefficient for the given parameter.
double | getL2ByParam(String paramName): Get the L2 coefficient for the given parameter.
String | getLayerName()
IUpdater | getUpdaterByParam(String paramName): Get the updater for the given parameter.
boolean | isPretrain()
boolean | isPretrainParam(String paramName): Is the specified parameter a layerwise pretraining-only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers with no pretrainable parameters (such as DenseLayer) return false for all (valid) inputs.
void | setPretrain(boolean pretrain)
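To make the per-parameter lookup methods concrete, here is a minimal, hedged sketch of a map-backed configuration. It is not the real DL4J implementation: the class name, parameter keys ("W", "b"), and coefficient values are illustrative, and the GradientNormalization/IUpdater-returning methods are omitted to keep it self-contained.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative stand-in for a TrainingConfig-style object. Regularization
// coefficients are looked up per parameter name; unknown names fall back
// to 0.0 (no regularization). All values here are assumptions.
public class SimpleTrainingConfig {
    private final Map<String, Double> l1ByParam = new HashMap<>();
    private final Map<String, Double> l2ByParam = new HashMap<>();
    private boolean pretrain = false;

    public SimpleTrainingConfig() {
        // Typical pattern: regularize weights, leave biases unregularized.
        l1ByParam.put("W", 1e-4);
        l2ByParam.put("W", 1e-3);
        l1ByParam.put("b", 0.0);
        l2ByParam.put("b", 0.0);
    }

    public String getLayerName() {
        return "layer0"; // illustrative layer name
    }

    public double getL1ByParam(String paramName) {
        return l1ByParam.getOrDefault(paramName, 0.0);
    }

    public double getL2ByParam(String paramName) {
        return l2ByParam.getOrDefault(paramName, 0.0);
    }

    public boolean isPretrain() {
        return pretrain;
    }

    public void setPretrain(boolean pretrain) {
        this.pretrain = pretrain;
    }
}
```

The map-backed lookup mirrors how the interface is queried: callers pass a parameter name and receive that parameter's coefficient, so different parameters of the same layer can be regularized differently.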
String getLayerName()

boolean isPretrain()

double getL1ByParam(String paramName)
paramName - Parameter name

double getL2ByParam(String paramName)
paramName - Parameter name

boolean isPretrainParam(String paramName)
paramName - Parameter name/key

IUpdater getUpdaterByParam(String paramName)
paramName - Parameter name

GradientNormalization getGradientNormalization()

double getGradientNormalizationThreshold()

void setPretrain(boolean pretrain)
pretrain - Whether the layer is currently undergoing layerwise unsupervised training, or multi-layer backprop
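The isPretrainParam contract described above can be sketched as follows. This is a hedged illustration, not DL4J source: the parameter key "vb" for an autoencoder's visible bias follows common DL4J naming but is an assumption here, as are the class and method names.

```java
import java.util.Set;

// Sketch of isPretrainParam semantics for two layer kinds:
// an autoencoder-style layer treats its visible-bias parameter as
// pretrain-only, while a dense layer has no pretrainable parameters
// and returns false for every (valid) input.
public class PretrainParamCheck {
    // Parameters used only during layerwise unsupervised pretraining
    // (assumed key "vb" = visible bias of an autoencoder).
    private static final Set<String> AUTOENCODER_PRETRAIN_ONLY = Set.of("vb");

    public static boolean isPretrainParamAutoencoder(String paramName) {
        return AUTOENCODER_PRETRAIN_ONLY.contains(paramName);
    }

    public static boolean isPretrainParamDense(String paramName) {
        return false; // dense layers: no pretrainable parameters at all
    }
}
```

During supervised backprop, parameters for which this method returns true would simply be skipped, which is why a dense layer can unconditionally return false.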