public class SelfAttentionLayer extends SameDiffLayer
See also: LearnedSelfAttentionLayer, RecurrentAttentionLayer, MultiHeadDotProductAttention, Serialized Form

Nested Class Summary

Modifier and Type | Class and Description |
---|---|
static class | SelfAttentionLayer.Builder |
Fields inherited from class SameDiffLayer: paramWeightInit, weightInit
Fields inherited from class AbstractSameDiffLayer: biasUpdater, gradientNormalization, gradientNormalizationThreshold, regularization, regularizationBias, updater
Fields inherited from class Layer: constraints, iDropout, layerName
Constructor Summary

Modifier | Constructor and Description |
---|---|
protected | SelfAttentionLayer(SelfAttentionLayer.Builder builder) |
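Since the constructor is protected, instances are created through the nested SelfAttentionLayer.Builder. The configuration sketch below is illustrative only: it assumes Deeplearning4j's NeuralNetConfiguration API and the nHeads/projectInput options documented for DL4J's attention layers, so verify the option names against your DL4J version.

```java
// Sketch only: the nHeads and projectInput option names are assumed from
// DL4J's attention-layer API and may differ between DL4J versions.
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .list()
        .layer(new SelfAttentionLayer.Builder()
                .nIn(64)             // features per time step
                .nOut(64)            // output features per time step
                .nHeads(4)           // number of attention heads (assumed option)
                .projectInput(true)  // project input to Q/K/V (assumed option)
                .build())
        .layer(new GlobalPoolingLayer(PoolingType.MAX)) // collapse the time dimension
        .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                .nIn(64).nOut(10)
                .activation(Activation.SOFTMAX)
                .build())
        .setInputType(InputType.recurrent(64))
        .build();
```

With a single head and no input projection, nIn and nOut typically have to match; with multiple heads the input is first projected to queries, keys, and values, which is why projectInput is enabled here.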
Method Summary

Modifier and Type | Method and Description |
---|---|
SDVariable | defineLayer(SameDiff sameDiff, SDVariable layerInput, Map<String,SDVariable> paramTable, SDVariable mask) - Define the layer |
void | defineParameters(SDLayerParams params) - Define the parameters for the network. |
InputType | getOutputType(int layerIndex, InputType inputType) - For a given type of input to this layer, what is the type of the output? |
InputPreProcessor | getPreProcessorForInputType(InputType inputType) - For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor |
void | initializeParameters(Map<String,INDArray> params) - Set the initial parameter values for this layer, if required |
void | setNIn(InputType inputType, boolean override) - Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type |
Methods inherited from class SameDiffLayer: feedForwardMaskArray, instantiate, validateInput
Methods inherited from class AbstractSameDiffLayer: applyGlobalConfig, applyGlobalConfigToLayer, getLayerParams, getMemoryReport, getRegularizationByParam, getUpdaterByParam, initializer, initWeights, isPretrainParam, onesMaskForInput, paramReshapeOrder
Methods inherited from class Layer: clone, initializeConstraints, resetLayerDefaultConfig, setDataType
Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface TrainingConfig: getGradientNormalization, getGradientNormalizationThreshold, getLayerName
Constructor Detail

protected SelfAttentionLayer(SelfAttentionLayer.Builder builder)
Method Detail

public InputPreProcessor getPreProcessorForInputType(InputType inputType)
Description copied from class: Layer. For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor.
Overrides: getPreProcessorForInputType in class AbstractSameDiffLayer
Parameters: inputType - InputType to this layer

public void setNIn(InputType inputType, boolean override)
Description copied from class: Layer. Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
Overrides: setNIn in class AbstractSameDiffLayer
Parameters: inputType - Input type for this layer
override - If false: only set the nIn value if it's not already set. If true: set it regardless of whether it's already set or not.

public InputType getOutputType(int layerIndex, InputType inputType)
Description copied from class: Layer. For a given type of input to this layer, what is the type of the output?
Specified by: getOutputType in class Layer
Parameters: layerIndex - Index of the layer
inputType - Type of input for the layer

public void defineParameters(SDLayerParams params)
Description copied from class: AbstractSameDiffLayer. Define the parameters for the network. Parameters are defined via SDLayerParams.addWeightParam(String, long...) and SDLayerParams.addBiasParam(String, long...).
Specified by: defineParameters in class AbstractSameDiffLayer
Parameters: params - Object used to set parameters for this layer

public void initializeParameters(Map<String,INDArray> params)
Description copied from class: AbstractSameDiffLayer. Set the initial parameter values for this layer, if required.
Specified by: initializeParameters in class AbstractSameDiffLayer
Parameters: params - Parameter arrays that may be initialized

public SDVariable defineLayer(SameDiff sameDiff, SDVariable layerInput, Map<String,SDVariable> paramTable, SDVariable mask)
Description copied from class: SameDiffLayer. Define the layer.
Specified by: defineLayer in class SameDiffLayer
Parameters: sameDiff - SameDiff instance
layerInput - Input to the layer
paramTable - Parameter table - keys as defined by AbstractSameDiffLayer.defineParameters(SDLayerParams)
mask - Optional, may be null. Mask to apply if supported

Copyright © 2020. All rights reserved.
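For intuition, defineLayer assembles the attention computation as SameDiff ops over layerInput, using the weights registered in paramTable and applying mask where supported. The plain-Java sketch below shows what single-head scaled dot-product self-attention computes; all names are hypothetical and this is not DL4J's actual implementation.

```java
import java.util.Arrays;

// Plain-Java sketch of single-head scaled dot-product self-attention.
// Hypothetical helper names; illustrates the math, not DL4J's SameDiff graph.
public class AttentionSketch {

    // Numerically stable softmax over one row, in place.
    static void softmax(double[] row) {
        double max = Arrays.stream(row).max().orElse(0.0);
        double sum = 0.0;
        for (int i = 0; i < row.length; i++) {
            row[i] = Math.exp(row[i] - max);
            sum += row[i];
        }
        for (int i = 0; i < row.length; i++) row[i] /= sum;
    }

    // Naive matrix multiply: [n x k] * [k x m] -> [n x m].
    static double[][] matmul(double[][] a, double[][] b) {
        double[][] out = new double[a.length][b[0].length];
        for (int i = 0; i < a.length; i++)
            for (int t = 0; t < b.length; t++)
                for (int j = 0; j < b[0].length; j++)
                    out[i][j] += a[i][t] * b[t][j];
        return out;
    }

    // x: [seqLen x features]; wq, wk, wv: [features x features] projections.
    // Output: [seqLen x features]; each row is a convex combination of value rows.
    static double[][] selfAttention(double[][] x, double[][] wq, double[][] wk, double[][] wv) {
        double[][] q = matmul(x, wq), k = matmul(x, wk), v = matmul(x, wv);
        int n = x.length, d = wq[0].length;
        double[][] scores = new double[n][n];
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                double dot = 0.0;
                for (int t = 0; t < d; t++) dot += q[i][t] * k[j][t];
                scores[i][j] = dot / Math.sqrt(d); // scaled dot product
            }
            softmax(scores[i]); // attention weights for position i
        }
        return matmul(scores, v); // weighted sum of values
    }

    public static void main(String[] args) {
        double[][] id = {{1, 0}, {0, 1}}; // identity projections for clarity
        double[][] out = selfAttention(id, id, id, id);
        // Each output row sums to 1: the attention weights form a convex
        // combination of the one-hot value rows.
        System.out.printf("%.4f %.4f%n", out[0][0], out[0][1]);
    }
}
```

A padding mask, when supplied, would zero out the attention weights of masked positions before the weighted sum, which is what the optional mask parameter of defineLayer is for.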