public interface TrainingConfig
Modifier and Type | Method and Description
---|---
GradientNormalization | getGradientNormalization()
double | getGradientNormalizationThreshold()
String | getLayerName()
List<Regularization> | getRegularizationByParam(String paramName) Get the regularization types (l1/l2/weight decay) for the given parameter.
IUpdater | getUpdaterByParam(String paramName) Get the updater for the given parameter.
boolean | isPretrainParam(String paramName) Is the specified parameter a layerwise pretraining-only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) are not used during supervised backprop. Layers with no pretrainable parameters (such as DenseLayer) return false for all valid inputs.
void | setDataType(DataType dataType)
String getLayerName()

List<Regularization> getRegularizationByParam(String paramName)

paramName - Parameter name ("W", "b", etc.)

boolean isPretrainParam(String paramName)

paramName - Parameter name/key

IUpdater getUpdaterByParam(String paramName)

paramName - Parameter name

GradientNormalization getGradientNormalization()

double getGradientNormalizationThreshold()

void setDataType(DataType dataType)
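To illustrate the contract above, here is a minimal, self-contained sketch. It does not use the real DL4J classes: Regularization, IUpdater, and the AutoEncoderConfig implementation below are simplified stand-ins invented for this example, and the parameter keys ("W", "b", "vb") follow DL4J's usual naming convention for weights and biases. The key point shown is how a backprop loop can use isPretrainParam(...) to skip pretrain-only parameters.

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;

public class TrainingConfigSketch {

    // Simplified stand-in for DL4J's Regularization (not the real class)
    interface Regularization { double penaltyGradient(double weight); }

    // Simplified stand-in for DL4J's IUpdater (not the real class)
    interface IUpdater { double update(double gradient); }

    // A minimal slice of the TrainingConfig contract documented above
    interface TrainingConfig {
        String getLayerName();
        List<Regularization> getRegularizationByParam(String paramName);
        IUpdater getUpdaterByParam(String paramName);
        boolean isPretrainParam(String paramName);
    }

    // Hypothetical autoencoder-like config: the visible bias "vb" exists
    // only for layerwise pretraining and is skipped during backprop.
    static class AutoEncoderConfig implements TrainingConfig {
        public String getLayerName() { return "autoencoder_0"; }

        public List<Regularization> getRegularizationByParam(String paramName) {
            // L2 penalty on the weights only; biases are left unregularized
            if ("W".equals(paramName)) {
                return Collections.singletonList(w -> 1e-4 * w);
            }
            return Collections.emptyList();
        }

        public IUpdater getUpdaterByParam(String paramName) {
            double lr = 0.01;          // plain SGD stand-in updater
            return g -> -lr * g;       // returns the parameter delta
        }

        public boolean isPretrainParam(String paramName) {
            // "vb" (visible bias) is pretrain-only; "W" and "b" are shared
            return "vb".equals(paramName);
        }
    }

    // A supervised backprop pass would only touch non-pretrain parameters:
    static List<String> backpropParams(TrainingConfig conf, List<String> all) {
        return all.stream()
                  .filter(p -> !conf.isPretrainParam(p))
                  .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        TrainingConfig conf = new AutoEncoderConfig();
        // "vb" is filtered out, leaving only the backprop-trained params
        System.out.println(backpropParams(conf, Arrays.asList("W", "b", "vb")));
    }
}
```

A DenseLayer-style config, by contrast, would simply return false from isPretrainParam for every valid key, so its full parameter list survives the filter unchanged.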
Copyright © 2020. All rights reserved.