public static class RecurrentAttentionLayer.Builder extends SameDiffLayer.Builder<RecurrentAttentionLayer.Builder>
Fields inherited from class SameDiffLayer.Builder: paramWeightInit, weightInit, biasUpdater, regularization, regularizationBias, updater, allParamConstraints, biasConstraints, iDropout, layerName, weightConstraints

| Constructor and Description |
|---|
| Builder() |
| Modifier and Type | Method and Description |
|---|---|
| RecurrentAttentionLayer.Builder | activation(Activation activation) |
| RecurrentAttentionLayer | build() |
| RecurrentAttentionLayer.Builder | hasBias(boolean hasBias) |
| RecurrentAttentionLayer.Builder | headSize(int headSize) Size of attention heads |
| RecurrentAttentionLayer.Builder | nHeads(int nHeads) Number of attention heads |
| RecurrentAttentionLayer.Builder | nIn(int nIn) |
| RecurrentAttentionLayer.Builder | nOut(int nOut) |
| RecurrentAttentionLayer.Builder | projectInput(boolean projectInput) Project input before applying attention or not |
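A minimal sketch of the builder chain, using only the methods listed above. The package paths (org.deeplearning4j.nn.conf.layers for the layer, org.nd4j.linalg.activations for Activation) and all sizes are assumptions for illustration, not taken from this page.

```java
import org.deeplearning4j.nn.conf.layers.RecurrentAttentionLayer;
import org.nd4j.linalg.activations.Activation;

public class RecurrentAttentionBuilderSketch {
    public static void main(String[] args) {
        // Hypothetical sizes, chosen only to illustrate the builder methods documented above
        RecurrentAttentionLayer attention = new RecurrentAttentionLayer.Builder()
                .nIn(128)                     // number of inputs to the layer (input size)
                .nOut(128)                    // number of outputs (output size)
                .nHeads(4)                    // number of attention heads
                .headSize(32)                 // size of each attention head
                .projectInput(true)           // project input before applying attention
                .hasBias(true)                // default is true
                .activation(Activation.TANH)  // activation function for the layer
                .build();
        System.out.println(attention);
    }
}
```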
Methods inherited from classes SameDiffLayer.Builder and Layer.Builder: weightInit, biasUpdater, l1, l1Bias, l2, l2Bias, regularization, regularizationBias, updater, weightDecay, weightDecayBias, constrainAllParameters, constrainBias, constrainWeights, dropOut, name

public RecurrentAttentionLayer.Builder nIn(int nIn)
nIn - Number of inputs to the layer (input size)

public RecurrentAttentionLayer.Builder nOut(int nOut)
nOut - Number of outputs (output size)

public RecurrentAttentionLayer.Builder nHeads(int nHeads)
nHeads - Number of attention heads

public RecurrentAttentionLayer.Builder headSize(int headSize)
headSize - Size of each attention head

public RecurrentAttentionLayer.Builder projectInput(boolean projectInput)
projectInput - Whether to project the input before applying attention

public RecurrentAttentionLayer.Builder hasBias(boolean hasBias)
hasBias - If true (default is true), the layer will have a bias

public RecurrentAttentionLayer.Builder activation(Activation activation)
activation - Activation function for the layer

public RecurrentAttentionLayer build()
Overrides: build in class Layer.Builder<RecurrentAttentionLayer.Builder>
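A hedged sketch of wiring the built layer into a network configuration, assuming it consumes and produces RNN-format (time series) activations and sits between a recurrent layer and an RNN output layer. Everything outside RecurrentAttentionLayer.Builder itself (LSTM, RnnOutputLayer, NeuralNetConfiguration, loss function, sizes) is an assumption about standard DL4J usage, not part of this page.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.RecurrentAttentionLayer;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class RecurrentAttentionConfigSketch {
    public static void main(String[] args) {
        int nIn = 32, hidden = 64, nClasses = 10;   // hypothetical sizes
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new LSTM.Builder().nIn(nIn).nOut(hidden)
                        .activation(Activation.TANH).build())
                .layer(new RecurrentAttentionLayer.Builder()
                        .nIn(hidden).nOut(hidden)
                        .nHeads(4)            // assumption: nOut divisible by nHeads when headSize is not set
                        .projectInput(true)
                        .build())
                .layer(new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(hidden).nOut(nClasses)
                        .activation(Activation.SOFTMAX).build())
                .build();
        System.out.println(conf.toJson());
    }
}
```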