Class RecurrentAttentionLayer.Builder

java.lang.Object
    org.deeplearning4j.nn.conf.layers.Layer.Builder<T>
        org.deeplearning4j.nn.conf.layers.samediff.AbstractSameDiffLayer.Builder<T>
            org.deeplearning4j.nn.conf.layers.samediff.SameDiffLayer.Builder<RecurrentAttentionLayer.Builder>
                org.deeplearning4j.nn.conf.layers.RecurrentAttentionLayer.Builder

Enclosing class:
    RecurrentAttentionLayer

public static class RecurrentAttentionLayer.Builder
extends SameDiffLayer.Builder<RecurrentAttentionLayer.Builder>

Field Summary
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.samediff.SameDiffLayer.Builder
paramWeightInit, weightInit
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.samediff.AbstractSameDiffLayer.Builder
biasUpdater, regularization, regularizationBias, updater
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.Layer.Builder
allParamConstraints, biasConstraints, iDropout, layerName, weightConstraints
-
-
Constructor Summary

Constructor     Description
Builder()
-
Method Summary

All Methods | Instance Methods | Concrete Methods

Modifier and Type                   Method                                Description
RecurrentAttentionLayer.Builder     activation(Activation activation)     Activation function for the layer
RecurrentAttentionLayer             build()
RecurrentAttentionLayer.Builder     hasBias(boolean hasBias)              If true (the default), the layer has a bias
RecurrentAttentionLayer.Builder     headSize(int headSize)                Size of the attention heads
RecurrentAttentionLayer.Builder     nHeads(int nHeads)                    Number of attention heads
RecurrentAttentionLayer.Builder     nIn(int nIn)                          Number of inputs to the layer (input size)
RecurrentAttentionLayer.Builder     nOut(int nOut)                        Number of outputs (output size)
RecurrentAttentionLayer.Builder     projectInput(boolean projectInput)    Whether to project the input before applying attention
Methods inherited from class org.deeplearning4j.nn.conf.layers.samediff.SameDiffLayer.Builder
weightInit, weightInit
-
Methods inherited from class org.deeplearning4j.nn.conf.layers.samediff.AbstractSameDiffLayer.Builder
biasUpdater, l1, l1Bias, l2, l2Bias, regularization, regularizationBias, updater, weightDecay, weightDecay, weightDecayBias, weightDecayBias
-
Methods inherited from class org.deeplearning4j.nn.conf.layers.Layer.Builder
constrainAllParameters, constrainBias, constrainWeights, dropOut, dropOut, name
-
Method Detail
-
nIn

public RecurrentAttentionLayer.Builder nIn(int nIn)

Parameters:
    nIn - Number of inputs to the layer (input size)
-
nOut

public RecurrentAttentionLayer.Builder nOut(int nOut)

Parameters:
    nOut - Number of outputs (output size)
-
nHeads

public RecurrentAttentionLayer.Builder nHeads(int nHeads)

Number of attention heads.
-
headSize

public RecurrentAttentionLayer.Builder headSize(int headSize)

Size of the attention heads.
-
projectInput

public RecurrentAttentionLayer.Builder projectInput(boolean projectInput)

Whether to project the input before applying attention.
-
hasBias

public RecurrentAttentionLayer.Builder hasBias(boolean hasBias)

Parameters:
    hasBias - If true (default is true), the layer will have a bias
-
activation

public RecurrentAttentionLayer.Builder activation(Activation activation)

Parameters:
    activation - Activation function for the layer
-
build

public RecurrentAttentionLayer build()

Specified by:
    build in class Layer.Builder<RecurrentAttentionLayer.Builder>
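Taken together, these builder methods are normally used as a fluent chain ending in build(). A minimal sketch follows, assuming the Deeplearning4j dependencies are on the classpath; the dimension values (and the relation nOut = nHeads * headSize) are illustrative assumptions, not requirements stated on this page:

```java
import org.deeplearning4j.nn.conf.layers.RecurrentAttentionLayer;
import org.nd4j.linalg.activations.Activation;

public class RecurrentAttentionExample {
    public static void main(String[] args) {
        // All sizes below are illustrative assumptions for this sketch.
        RecurrentAttentionLayer attention = new RecurrentAttentionLayer.Builder()
                .nIn(128)                      // number of inputs to the layer (input size)
                .nOut(128)                     // number of outputs (output size)
                .nHeads(4)                     // number of attention heads
                .headSize(32)                  // size of each attention head
                .projectInput(true)            // project the input before applying attention
                .hasBias(true)                 // default is true
                .activation(Activation.TANH)   // activation function for the layer
                .build();
    }
}
```

The resulting layer instance would then be added to a network configuration (e.g. via a NeuralNetConfiguration ListBuilder) like any other DL4J layer.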
-
-