Class AbstractSameDiffLayer.Builder<T extends AbstractSameDiffLayer.Builder<T>>

java.lang.Object
    org.deeplearning4j.nn.conf.layers.Layer.Builder<T>
        org.deeplearning4j.nn.conf.layers.samediff.AbstractSameDiffLayer.Builder<T>

Direct Known Subclasses:
    SameDiffLayer.Builder

Enclosing class:
    AbstractSameDiffLayer

public abstract static class AbstractSameDiffLayer.Builder<T extends AbstractSameDiffLayer.Builder<T>>
extends Layer.Builder<T>
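
Typical usage, as a minimal sketch: the snippet below assumes a hypothetical custom layer, MyCustomSameDiffLayer, whose Builder extends AbstractSameDiffLayer.Builder (for example via SameDiffLayer.Builder). Only methods documented on this page are used, plus the standard ND4J updaters Adam and Sgd.

    import org.nd4j.linalg.learning.config.Adam;
    import org.nd4j.linalg.learning.config.Sgd;

    // "MyCustomSameDiffLayer" is hypothetical - any layer whose Builder extends
    // AbstractSameDiffLayer.Builder exposes the configuration methods shown here.
    MyCustomSameDiffLayer.Builder b = new MyCustomSameDiffLayer.Builder();
    b.updater(new Adam(1e-3));     // gradient updater for all parameters
    b.biasUpdater(new Sgd(1e-2));  // overrides the updater for the bias parameters only
    b.weightDecay(1e-4);           // weight decay on the weights (biases excluded)
    MyCustomSameDiffLayer layer = b.build();
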
Field Summary

Fields
    protected IUpdater biasUpdater
        Gradient updater configuration, for the biases only.
    protected List<Regularization> regularization
    protected List<Regularization> regularizationBias
    protected IUpdater updater
        Gradient updater.

Fields inherited from class org.deeplearning4j.nn.conf.layers.Layer.Builder
    allParamConstraints, biasConstraints, iDropout, layerName, weightConstraints

Constructor Summary

Constructors
    Builder()

Method Summary

    T biasUpdater(IUpdater biasUpdater)
        Gradient updater configuration, for the biases only.
    T l1(double l1)
        L1 regularization coefficient (weights only).
    T l1Bias(double l1Bias)
        L1 regularization coefficient for the bias.
    T l2(double l2)
        L2 regularization coefficient (weights only).
    T l2Bias(double l2Bias)
        L2 regularization coefficient for the bias.
    AbstractSameDiffLayer.Builder regularization(List<Regularization> regularization)
        Set the regularization for the parameters (excluding biases) - for example WeightDecay.
    AbstractSameDiffLayer.Builder regularizationBias(List<Regularization> regularizationBias)
        Set the regularization for the biases only - for example WeightDecay.
    T updater(IUpdater updater)
        Gradient updater.
    AbstractSameDiffLayer.Builder weightDecay(double coefficient)
        Add weight decay regularization for the network parameters (excluding biases). This applies weight decay with the learning rate multiplied in - see WeightDecay for more details.
    AbstractSameDiffLayer.Builder weightDecay(double coefficient, boolean applyLR)
        Add weight decay regularization for the network parameters (excluding biases).
    AbstractSameDiffLayer.Builder weightDecayBias(double coefficient)
        Weight decay for the biases only - see weightDecay(double) for more details.
    AbstractSameDiffLayer.Builder weightDecayBias(double coefficient, boolean applyLR)
        Weight decay for the biases only - see weightDecay(double) for more details.

Methods inherited from class org.deeplearning4j.nn.conf.layers.Layer.Builder
    build, constrainAllParameters, constrainBias, constrainWeights, dropOut, dropOut, name

Field Detail

regularization

    protected List<Regularization> regularization

regularizationBias

    protected List<Regularization> regularizationBias

updater

    protected IUpdater updater

biasUpdater

    protected IUpdater biasUpdater

    Gradient updater configuration, for the biases only. If not set, biases will use the updater as set by updater(IUpdater).

Method Detail

l1

    public T l1(double l1)

    L1 regularization coefficient (weights only). Use l1Bias(double) to configure the L1 regularization coefficient for the bias.

l2

    public T l2(double l2)

    L2 regularization coefficient (weights only). Use l2Bias(double) to configure the L2 regularization coefficient for the bias.

    Note: Generally, WeightDecay (set via weightDecay(double, boolean)) should be preferred to L2 regularization. See the WeightDecay javadoc for further details.

l1Bias

    public T l1Bias(double l1Bias)

    L1 regularization coefficient for the bias. Default: 0. See also l1(double).

l2Bias

    public T l2Bias(double l2Bias)

    L2 regularization coefficient for the bias. Default: 0. See also l2(double).

    Note: Generally, WeightDecay (set via weightDecayBias(double, boolean)) should be preferred to L2 regularization. See the WeightDecay javadoc for further details.
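
A minimal sketch of the L1/L2 setters, continuing the hypothetical builder b from the class description above; as noted in these javadocs, weightDecay(double, boolean) is generally preferred to plain L2.

    b.l1(1e-5);      // L1 coefficient, weights only
    b.l1Bias(1e-6);  // L1 coefficient for the bias (default: 0)
    b.l2(1e-4);      // L2 coefficient, weights only (WeightDecay is generally preferred)
    b.l2Bias(1e-5);  // L2 coefficient for the bias (default: 0)
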

weightDecay

    public AbstractSameDiffLayer.Builder weightDecay(double coefficient)

    Add weight decay regularization for the network parameters (excluding biases). This applies weight decay with the learning rate multiplied in - see WeightDecay for more details.

    Parameters:
        coefficient - Weight decay regularization coefficient
    See Also:
        weightDecay(double, boolean)

weightDecay

    public AbstractSameDiffLayer.Builder weightDecay(double coefficient, boolean applyLR)

    Add weight decay regularization for the network parameters (excluding biases). See WeightDecay for more details.

    Parameters:
        coefficient - Weight decay regularization coefficient
        applyLR - Whether the learning rate should be multiplied in when performing weight decay updates. See WeightDecay for more details.
    See Also:
        weightDecay(double, boolean)
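
A minimal sketch contrasting the two overloads, continuing the hypothetical builder b from above; the boolean controls whether the learning rate is multiplied in, as described for the applyLR parameter.

    b.weightDecay(1e-4);         // learning rate is multiplied in when applying the decay
    b.weightDecay(1e-4, false);  // applyLR = false: decay applied without the learning rate factor
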

weightDecayBias

    public AbstractSameDiffLayer.Builder weightDecayBias(double coefficient)

    Weight decay for the biases only - see weightDecay(double) for more details. This applies weight decay with the learning rate multiplied in.

    Parameters:
        coefficient - Weight decay regularization coefficient
    See Also:
        weightDecayBias(double, boolean)

weightDecayBias

    public AbstractSameDiffLayer.Builder weightDecayBias(double coefficient, boolean applyLR)

    Weight decay for the biases only - see weightDecay(double) for more details.

    Parameters:
        coefficient - Weight decay regularization coefficient
        applyLR - Whether the learning rate should be multiplied in when performing weight decay updates. See WeightDecay for more details.

regularization

    public AbstractSameDiffLayer.Builder regularization(List<Regularization> regularization)

    Set the regularization for the parameters (excluding biases) - for example WeightDecay.

    Parameters:
        regularization - Regularization to apply for the network parameters/weights (excluding biases)

regularizationBias

    public AbstractSameDiffLayer.Builder regularizationBias(List<Regularization> regularizationBias)

    Set the regularization for the biases only - for example WeightDecay.

    Parameters:
        regularizationBias - Regularization to apply for the network biases only
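
A minimal sketch of passing explicit Regularization instances, continuing the hypothetical builder b from above and assuming the ND4J regularization classes (org.nd4j.linalg.learning.regularization); WeightDecay(coefficient, applyLR) mirrors weightDecay(double, boolean) and weightDecayBias(double, boolean).

    import java.util.Arrays;
    import java.util.List;
    import org.nd4j.linalg.learning.regularization.Regularization;
    import org.nd4j.linalg.learning.regularization.WeightDecay;

    List<Regularization> weightReg = Arrays.asList(new WeightDecay(1e-4, true));
    List<Regularization> biasReg = Arrays.asList(new WeightDecay(1e-5, true));
    b.regularization(weightReg);    // applied to the weights (biases excluded)
    b.regularizationBias(biasReg);  // applied to the biases only
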

updater

    public T updater(IUpdater updater)

    Gradient updater.

biasUpdater

    public T biasUpdater(IUpdater biasUpdater)

    Gradient updater configuration, for the biases only. If not set, biases will use the updater as set by updater(IUpdater).

    Parameters:
        biasUpdater - Updater to use for bias parameters