public abstract static class AbstractLSTM.Builder<T extends AbstractLSTM.Builder<T>> extends BaseRecurrentLayer.Builder<T>
| Modifier and Type | Field and Description |
|---|---|
| protected double | forgetGateBiasInit: Set forget gate bias initialization. |
| protected IActivation | gateActivationFn: Activation function for the LSTM gates. |
| protected boolean | helperAllowFallback: When using CuDNN and an error is encountered, should fallback to the non-CuDNN implementation be allowed? If set to false, an exception in CuDNN will be propagated back to the user. |
Fields inherited from class BaseRecurrentLayer.Builder: inputWeightConstraints, recurrentConstraints, rnnDataFormat, weightInitFnRecurrent

Fields inherited from class FeedForwardLayer.Builder: nIn, nOut

Fields inherited from class BaseLayer.Builder: activationFn, biasInit, biasUpdater, gainInit, gradientNormalization, gradientNormalizationThreshold, iupdater, regularization, regularizationBias, weightInitFn, weightNoise

Fields inherited from class Layer.Builder: allParamConstraints, biasConstraints, iDropout, layerName, weightConstraints

| Constructor and Description |
|---|
| Builder() |
| Modifier and Type | Method and Description |
|---|---|
| T | forgetGateBiasInit(double biasInit): Set forget gate bias initialization. |
| T | gateActivationFunction(Activation gateActivationFn): Activation function for the LSTM gates. |
| T | gateActivationFunction(IActivation gateActivationFn): Activation function for the LSTM gates. |
| T | gateActivationFunction(String gateActivationFn): Activation function for the LSTM gates. |
| T | helperAllowFallback(boolean allowFallback): When using a helper (CuDNN or MKLDNN in some cases) and an error is encountered, should fallback to the non-helper implementation be allowed? If set to false, an exception in the helper will be propagated back to the user. |
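As a minimal sketch of how these builder methods are typically chained, assuming the concrete LSTM.Builder subclass and the standard DL4J/ND4J packages (the layer sizes and the sigmoid gate activation are illustrative choices, not documented defaults):

```java
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.nd4j.linalg.activations.Activation;

public class LstmBuilderExample {
    public static void main(String[] args) {
        // Configure an LSTM layer using the builder methods summarized above.
        LSTM lstm = new LSTM.Builder()
                .nIn(64)                                    // inherited from FeedForwardLayer.Builder
                .nOut(128)
                .gateActivationFunction(Activation.SIGMOID) // activation for the LSTM gates
                .forgetGateBiasInit(1.0)                    // forget gate bias initialization
                .helperAllowFallback(true)                  // allow fallback if the CuDNN/MKLDNN helper fails
                .build();
        System.out.println(lstm);
    }
}
```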
Methods inherited from class BaseRecurrentLayer.Builder: constrainInputWeights, constrainRecurrent, dataFormat, weightInitRecurrent

Methods inherited from class FeedForwardLayer.Builder: nIn, nOut, units

Methods inherited from class BaseLayer.Builder: activation, biasInit, biasUpdater, dist, gainInit, gradientNormalization, gradientNormalizationThreshold, l1, l1Bias, l2, l2Bias, regularization, regularizationBias, updater, weightDecay, weightDecayBias, weightInit, weightNoise

Methods inherited from class Layer.Builder: build, constrainAllParameters, constrainBias, constrainWeights, dropOut, name

Field Detail

protected double forgetGateBiasInit
protected IActivation gateActivationFn
protected boolean helperAllowFallback
Method Detail

public T forgetGateBiasInit(double biasInit)
Set forget gate bias initialization.

public T gateActivationFunction(String gateActivationFn)
Parameters: gateActivationFn - Activation function for the LSTM gates

public T gateActivationFunction(Activation gateActivationFn)
Parameters: gateActivationFn - Activation function for the LSTM gates

public T gateActivationFunction(IActivation gateActivationFn)
Parameters: gateActivationFn - Activation function for the LSTM gates

public T helperAllowFallback(boolean allowFallback)
Parameters: allowFallback - Whether fallback to non-helper implementation should be used
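The three gateActivationFunction overloads accept the same gate activation expressed in different forms. A hedged sketch of the equivalence, assuming the ND4J Activation enum and the ActivationSigmoid implementation class (sigmoid is an illustrative choice, not a stated default):

```java
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.activations.impl.ActivationSigmoid;

public class GateActivationOverloads {
    public static void main(String[] args) {
        // The same gate activation supplied three ways: by name, by enum, and by IActivation instance.
        LSTM byName = new LSTM.Builder().nIn(32).nOut(32)
                .gateActivationFunction("sigmoid").build();
        LSTM byEnum = new LSTM.Builder().nIn(32).nOut(32)
                .gateActivationFunction(Activation.SIGMOID).build();
        LSTM byInstance = new LSTM.Builder().nIn(32).nOut(32)
                .gateActivationFunction(new ActivationSigmoid()).build();
    }
}
```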