Class AbstractLSTM.Builder<T extends AbstractLSTM.Builder<T>>

java.lang.Object
- org.deeplearning4j.nn.conf.layers.Layer.Builder<T>
  - org.deeplearning4j.nn.conf.layers.BaseLayer.Builder<T>
    - org.deeplearning4j.nn.conf.layers.FeedForwardLayer.Builder<T>
      - org.deeplearning4j.nn.conf.layers.BaseRecurrentLayer.Builder<T>
        - org.deeplearning4j.nn.conf.layers.AbstractLSTM.Builder<T>

Direct Known Subclasses:
GravesLSTM.Builder, LSTM.Builder

Enclosing class:
AbstractLSTM

public abstract static class AbstractLSTM.Builder<T extends AbstractLSTM.Builder<T>> extends BaseRecurrentLayer.Builder<T>
-
-
Field Summary

protected double forgetGateBiasInit
Set forget gate bias initialization.

protected IActivation gateActivationFn
Activation function for the LSTM gates.

protected boolean helperAllowFallback
When using CuDNN and an error is encountered, should fallback to the non-CuDNN implementation be allowed? If set to false, an exception in CuDNN will be propagated back to the user.
Fields inherited from class org.deeplearning4j.nn.conf.layers.BaseRecurrentLayer.Builder
inputWeightConstraints, recurrentConstraints, rnnDataFormat, weightInitFnRecurrent
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.FeedForwardLayer.Builder
nIn, nOut
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.BaseLayer.Builder
activationFn, biasInit, biasUpdater, gainInit, gradientNormalization, gradientNormalizationThreshold, iupdater, regularization, regularizationBias, weightInitFn, weightNoise
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.Layer.Builder
allParamConstraints, biasConstraints, iDropout, layerName, weightConstraints
-
-
Constructor Summary

Builder()
-
Method Summary

T forgetGateBiasInit(double biasInit)
Set forget gate bias initialization.

T gateActivationFunction(String gateActivationFn)
Activation function for the LSTM gates.

T gateActivationFunction(Activation gateActivationFn)
Activation function for the LSTM gates.

T gateActivationFunction(IActivation gateActivationFn)
Activation function for the LSTM gates.

T helperAllowFallback(boolean allowFallback)
When using a helper (CuDNN or MKLDNN in some cases) and an error is encountered, should fallback to the non-helper implementation be allowed? If set to false, an exception in the helper will be propagated back to the user.
Methods inherited from class org.deeplearning4j.nn.conf.layers.BaseRecurrentLayer.Builder
constrainInputWeights, constrainRecurrent, dataFormat, weightInitRecurrent, weightInitRecurrent, weightInitRecurrent
-
Methods inherited from class org.deeplearning4j.nn.conf.layers.FeedForwardLayer.Builder
nIn, nIn, nOut, nOut, units
-
Methods inherited from class org.deeplearning4j.nn.conf.layers.BaseLayer.Builder
activation, activation, biasInit, biasUpdater, dist, gainInit, gradientNormalization, gradientNormalizationThreshold, l1, l1Bias, l2, l2Bias, regularization, regularizationBias, updater, updater, weightDecay, weightDecay, weightDecayBias, weightDecayBias, weightInit, weightInit, weightInit, weightNoise
-
Methods inherited from class org.deeplearning4j.nn.conf.layers.Layer.Builder
build, constrainAllParameters, constrainBias, constrainWeights, dropOut, dropOut, name
-
-
-
-
Field Detail
-
forgetGateBiasInit
protected double forgetGateBiasInit
Set forget gate bias initialization. Values in the range 1-5 can potentially help with learning of longer-term dependencies.
-
gateActivationFn
protected IActivation gateActivationFn
Activation function for the LSTM gates. Note: this should be bounded in the range 0-1, such as sigmoid or hard sigmoid.
-
helperAllowFallback
protected boolean helperAllowFallback
When using CuDNN and an error is encountered, should fallback to the non-CuDNN implementation be allowed? If set to false, an exception in CuDNN will be propagated back to the user. If true, the built-in (non-CuDNN) implementation for LSTM/GravesLSTM will be used instead.
-
-
Method Detail
-
forgetGateBiasInit
public T forgetGateBiasInit(double biasInit)
Set forget gate bias initialization. Values in the range 1-5 can potentially help with learning of longer-term dependencies.
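A minimal sketch of how this builder method is typically used via the concrete LSTM.Builder subclass (requires the deeplearning4j dependency; the nIn/nOut values are arbitrary placeholders):

```java
import org.deeplearning4j.nn.conf.layers.LSTM;

public class ForgetGateBiasExample {
    public static void main(String[] args) {
        // Initialize the forget gate bias to 1.0 so the cell tends to retain
        // its state early in training; per the documentation above, values in
        // the range 1-5 can help with longer-term dependencies.
        LSTM lstmLayer = new LSTM.Builder()
                .nIn(100)   // arbitrary input size, for illustration only
                .nOut(200)  // arbitrary layer size, for illustration only
                .forgetGateBiasInit(1.0)
                .build();
    }
}
```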
-
gateActivationFunction
public T gateActivationFunction(String gateActivationFn)
Activation function for the LSTM gates. Note: this should be bounded in the range 0-1, such as sigmoid or hard sigmoid.

Parameters:
gateActivationFn - Activation function for the LSTM gates
-
gateActivationFunction
public T gateActivationFunction(Activation gateActivationFn)
Activation function for the LSTM gates. Note: this should be bounded in the range 0-1, such as sigmoid or hard sigmoid.

Parameters:
gateActivationFn - Activation function for the LSTM gates
-
gateActivationFunction
public T gateActivationFunction(IActivation gateActivationFn)
Activation function for the LSTM gates. Note: this should be bounded in the range 0-1, such as sigmoid or hard sigmoid.

Parameters:
gateActivationFn - Activation function for the LSTM gates
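A hedged sketch of the three overloads, assuming the ND4J Activation enum and hard-sigmoid implementation class, which keep the gate output bounded in 0-1 as the note above requires (nIn/nOut values are placeholders):

```java
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.activations.impl.ActivationHardSigmoid;

public class GateActivationExample {
    public static void main(String[] args) {
        // Three equivalent ways to select a bounded gate activation:
        LSTM byName = new LSTM.Builder().nIn(64).nOut(128)
                .gateActivationFunction("hardsigmoid")              // String overload
                .build();
        LSTM byEnum = new LSTM.Builder().nIn(64).nOut(128)
                .gateActivationFunction(Activation.HARDSIGMOID)     // Activation overload
                .build();
        LSTM byInstance = new LSTM.Builder().nIn(64).nOut(128)
                .gateActivationFunction(new ActivationHardSigmoid()) // IActivation overload
                .build();
    }
}
```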
-
helperAllowFallback
public T helperAllowFallback(boolean allowFallback)
When using a helper (CuDNN or MKLDNN in some cases) and an error is encountered, should fallback to the non-helper implementation be allowed? If set to false, an exception in the helper will be propagated back to the user. If true, the built-in (non-helper) implementation for LSTM/GravesLSTM will be used instead.

Parameters:
allowFallback - Whether fallback to the non-helper implementation should be used
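A minimal sketch of toggling fallback via the concrete LSTM.Builder subclass (the nIn/nOut values are placeholders):

```java
import org.deeplearning4j.nn.conf.layers.LSTM;

public class HelperFallbackExample {
    public static void main(String[] args) {
        // Disable fallback so CuDNN/MKLDNN errors surface immediately instead
        // of being masked by a silent switch to the slower built-in
        // implementation; useful when debugging GPU configuration issues.
        LSTM strictLayer = new LSTM.Builder()
                .nIn(64)
                .nOut(128)
                .helperAllowFallback(false)
                .build();
    }
}
```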