Class BaseRecurrentLayer.Builder<T extends BaseRecurrentLayer.Builder<T>>
java.lang.Object
    org.deeplearning4j.nn.conf.layers.Layer.Builder&lt;T&gt;
        org.deeplearning4j.nn.conf.layers.BaseLayer.Builder&lt;T&gt;
            org.deeplearning4j.nn.conf.layers.FeedForwardLayer.Builder&lt;T&gt;
                org.deeplearning4j.nn.conf.layers.BaseRecurrentLayer.Builder&lt;T&gt;

Direct Known Subclasses:
    AbstractLSTM.Builder, GravesBidirectionalLSTM.Builder, SimpleRnn.Builder

Enclosing class:
    BaseRecurrentLayer
public abstract static class BaseRecurrentLayer.Builder<T extends BaseRecurrentLayer.Builder<T>> extends FeedForwardLayer.Builder<T>
-
-
Field Summary
Fields

protected List&lt;LayerConstraint&gt; inputWeightConstraints
    Set constraints to be applied to the RNN input weight parameters of this layer.

protected List&lt;LayerConstraint&gt; recurrentConstraints
    Set constraints to be applied to the RNN recurrent weight parameters of this layer.

protected RNNFormat rnnDataFormat
    Set the format of data expected by the RNN.

protected IWeightInit weightInitFnRecurrent
    Set the weight initialization for the recurrent weights.
Fields inherited from class org.deeplearning4j.nn.conf.layers.FeedForwardLayer.Builder
nIn, nOut
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.BaseLayer.Builder
activationFn, biasInit, biasUpdater, gainInit, gradientNormalization, gradientNormalizationThreshold, iupdater, regularization, regularizationBias, weightInitFn, weightNoise
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.Layer.Builder
allParamConstraints, biasConstraints, iDropout, layerName, weightConstraints
-
-
Constructor Summary
Constructors

Builder()
-
Method Summary
All Methods · Instance Methods · Concrete Methods

T constrainInputWeights(LayerConstraint... constraints)
    Set constraints to be applied to the RNN input weight parameters of this layer.

T constrainRecurrent(LayerConstraint... constraints)
    Set constraints to be applied to the RNN recurrent weight parameters of this layer.

T dataFormat(RNNFormat rnnDataFormat)
    Set the format of data expected by the RNN.

T weightInitRecurrent(Distribution dist)
    Set the weight initialization for the recurrent weights, based on the specified distribution.

T weightInitRecurrent(IWeightInit weightInit)
    Set the weight initialization for the recurrent weights.

T weightInitRecurrent(WeightInit weightInit)
    Set the weight initialization for the recurrent weights.
Methods inherited from class org.deeplearning4j.nn.conf.layers.FeedForwardLayer.Builder
nIn, nIn, nOut, nOut, units
-
Methods inherited from class org.deeplearning4j.nn.conf.layers.BaseLayer.Builder
activation, activation, biasInit, biasUpdater, dist, gainInit, gradientNormalization, gradientNormalizationThreshold, l1, l1Bias, l2, l2Bias, regularization, regularizationBias, updater, updater, weightDecay, weightDecay, weightDecayBias, weightDecayBias, weightInit, weightInit, weightInit, weightNoise
-
Methods inherited from class org.deeplearning4j.nn.conf.layers.Layer.Builder
build, constrainAllParameters, constrainBias, constrainWeights, dropOut, dropOut, name
-
-
-
-
Field Detail
-
rnnDataFormat
protected RNNFormat rnnDataFormat
Set the format of data expected by the RNN. NCW = [miniBatchSize, size, timeSeriesLength], NWC = [miniBatchSize, timeSeriesLength, size]. Defaults to NCW.
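As an illustration, the snippet below configures the data format through a concrete subclass of this builder (LSTM.Builder). This is a minimal sketch assuming Deeplearning4j is on the classpath; the nIn/nOut values are arbitrary.

```java
import org.deeplearning4j.nn.conf.RNNFormat;
import org.deeplearning4j.nn.conf.layers.LSTM;

// NWC: each minibatch is shaped [miniBatchSize, timeSeriesLength, size].
// Omitting dataFormat(...) leaves the default NCW layout,
// i.e. [miniBatchSize, size, timeSeriesLength].
LSTM lstm = new LSTM.Builder()
        .nIn(8).nOut(16)
        .dataFormat(RNNFormat.NWC)
        .build();
```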
-
recurrentConstraints
protected List<LayerConstraint> recurrentConstraints
Set constraints to be applied to the RNN recurrent weight parameters of this layer. Default: no constraints.
Constraints can be used to enforce certain conditions (non-negativity of parameters, max-norm regularization, etc). These constraints are applied at each iteration, after the parameters have been updated.
-
inputWeightConstraints
protected List<LayerConstraint> inputWeightConstraints
Set constraints to be applied to the RNN input weight parameters of this layer. Default: no constraints.
Constraints can be used to enforce certain conditions (non-negativity of parameters, max-norm regularization, etc). These constraints are applied at each iteration, after the parameters have been updated.
-
weightInitFnRecurrent
protected IWeightInit weightInitFnRecurrent
Set the weight initialization for the recurrent weights. Note that if this is not set explicitly, the same weight initialization as the layer input weights is also used for the recurrent weights.
-
-
Method Detail
-
constrainRecurrent
public T constrainRecurrent(LayerConstraint... constraints)
Set constraints to be applied to the RNN recurrent weight parameters of this layer. Default: no constraints.
Constraints can be used to enforce certain conditions (non-negativity of parameters, max-norm regularization, etc). These constraints are applied at each iteration, after the parameters have been updated.

Parameters:
constraints - Constraints to apply to the recurrent weight parameters of this layer
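For example, a max-norm constraint could be applied to the recurrent and input weights separately through a concrete subclass such as LSTM.Builder. This is a hedged sketch: the MaxNormConstraint constructor arguments (the norm limit 2.0 and the dimension argument 1) are illustrative values, not a recommendation.

```java
import org.deeplearning4j.nn.conf.constraint.MaxNormConstraint;
import org.deeplearning4j.nn.conf.layers.LSTM;

LSTM lstm = new LSTM.Builder()
        .nIn(8).nOut(16)
        // Applied after each parameter update, to the recurrent weights only:
        .constrainRecurrent(new MaxNormConstraint(2.0, 1))
        // The input weights can be constrained independently:
        .constrainInputWeights(new MaxNormConstraint(2.0, 1))
        .build();
```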
-
constrainInputWeights
public T constrainInputWeights(LayerConstraint... constraints)
Set constraints to be applied to the RNN input weight parameters of this layer. Default: no constraints.
Constraints can be used to enforce certain conditions (non-negativity of parameters, max-norm regularization, etc). These constraints are applied at each iteration, after the parameters have been updated.

Parameters:
constraints - Constraints to apply to the input weight parameters of this layer
-
weightInitRecurrent
public T weightInitRecurrent(IWeightInit weightInit)
Set the weight initialization for the recurrent weights. Note that if this is not set explicitly, the same weight initialization as the layer input weights is also used for the recurrent weights.

Parameters:
weightInit - Weight initialization for the recurrent weights only.
-
weightInitRecurrent
public T weightInitRecurrent(WeightInit weightInit)
Set the weight initialization for the recurrent weights. Note that if this is not set explicitly, the same weight initialization as the layer input weights is also used for the recurrent weights.

Parameters:
weightInit - Weight initialization for the recurrent weights only.
-
weightInitRecurrent
public T weightInitRecurrent(Distribution dist)
Set the weight initialization for the recurrent weights, based on the specified distribution. Note that if this is not set explicitly, the same weight initialization as the layer input weights is also used for the recurrent weights.

Parameters:
dist - Distribution to use for initializing the recurrent weights
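The three overloads can be sketched as follows, again via LSTM.Builder as a concrete subclass. This assumes Deeplearning4j on the classpath; the particular initialization schemes and distribution parameters shown are arbitrary examples.

```java
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.weights.WeightInit;

// Enum overload: input weights use XAVIER (set via weightInit),
// while the recurrent weights use a different scheme.
LSTM a = new LSTM.Builder()
        .nIn(8).nOut(16)
        .weightInit(WeightInit.XAVIER)
        .weightInitRecurrent(WeightInit.UNIFORM)
        .build();

// Distribution overload: recurrent weights drawn from N(0, 0.01);
// the input weights keep the layer's default initialization.
LSTM b = new LSTM.Builder()
        .nIn(8).nOut(16)
        .weightInitRecurrent(new NormalDistribution(0, 0.01))
        .build();
```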
-
-