public class RecurrentAttentionLayer extends SameDiffLayer
| Modifier and Type | Class and Description |
|---|---|
| static class | `RecurrentAttentionLayer.Builder` |
Fields inherited from class SameDiffLayer: `paramWeightInit`, `weightInit`

Fields inherited from class AbstractSameDiffLayer: `biasUpdater`, `gradientNormalization`, `gradientNormalizationThreshold`, `regularization`, `regularizationBias`, `updater`

Fields inherited from class Layer: `constraints`, `iDropout`, `layerName`

| Modifier | Constructor and Description |
|---|---|
| protected | `RecurrentAttentionLayer(RecurrentAttentionLayer.Builder builder)` |
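The constructor is protected; in practice the layer is built through `RecurrentAttentionLayer.Builder` inside a network configuration. The following is a minimal sketch, not from this page: the layer sizes are arbitrary, and the `nHeads`/`projectInput` builder options are assumptions based on DL4J's attention-layer family.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.RecurrentAttentionLayer;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class RecurrentAttentionExample {
    public static void main(String[] args) {
        // Illustrative sizes; the number of heads should divide nOut evenly
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new LSTM.Builder().nIn(10).nOut(32).build())
                // Attention applied over the recurrent activations at each time step;
                // nHeads and projectInput are assumed Builder options
                .layer(new RecurrentAttentionLayer.Builder()
                        .nIn(32).nOut(32)
                        .nHeads(4)
                        .projectInput(true)
                        .build())
                .layer(new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX)
                        .nIn(32).nOut(5).build())
                .build();
    }
}
```

Note that `setNIn(InputType, boolean)` (below) can infer `nIn` from the previous layer's output type, so setting it explicitly is often optional.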
| Modifier and Type | Method and Description |
|---|---|
| `void` | `applyGlobalConfigToLayer(NeuralNetConfiguration.Builder globalConfig)` |
| `SDVariable` | `defineLayer(SameDiff sameDiff, SDVariable layerInput, Map<String,SDVariable> paramTable, SDVariable mask)` Define the layer |
| `void` | `defineParameters(SDLayerParams params)` Define the parameters for the network. |
| `InputType` | `getOutputType(int layerIndex, InputType inputType)` For a given type of input to this layer, what is the type of the output? |
| `InputPreProcessor` | `getPreProcessorForInputType(InputType inputType)` For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate `InputPreProcessor` for this layer, such as a `CnnToFeedForwardPreProcessor` |
| `void` | `initializeParameters(Map<String,INDArray> params)` Set the initial parameter values for this layer, if required |
| `void` | `setNIn(InputType inputType, boolean override)` Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type |
| `void` | `validateInput(INDArray input)` Validate input arrays to confirm that they fulfill the assumptions of the layer. |
Methods inherited from class SameDiffLayer: `feedForwardMaskArray`, `instantiate`

Methods inherited from class AbstractSameDiffLayer: `applyGlobalConfig`, `getLayerParams`, `getMemoryReport`, `getRegularizationByParam`, `getUpdaterByParam`, `initializer`, `initWeights`, `isPretrainParam`, `onesMaskForInput`, `paramReshapeOrder`

Methods inherited from class Layer: `clone`, `initializeConstraints`, `resetLayerDefaultConfig`, `setDataType`

Methods inherited from class java.lang.Object: `equals`, `finalize`, `getClass`, `hashCode`, `notify`, `notifyAll`, `toString`, `wait`, `wait`, `wait`

Methods inherited from interface TrainingConfig: `getGradientNormalization`, `getGradientNormalizationThreshold`, `getLayerName`

protected RecurrentAttentionLayer(RecurrentAttentionLayer.Builder builder)
public InputPreProcessor getPreProcessorForInputType(InputType inputType)

Description copied from class: Layer. For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor.

Overrides: getPreProcessorForInputType in class AbstractSameDiffLayer
Parameters: inputType - InputType to this layer

public void setNIn(InputType inputType, boolean override)

Description copied from class: Layer.

Overrides: setNIn in class AbstractSameDiffLayer
Parameters: inputType - Input type for this layer; override - If false: only set the nIn value if it's not already set. If true: set it regardless of whether it's already set or not.

public InputType getOutputType(int layerIndex, InputType inputType)

Description copied from class: Layer.

Specified by: getOutputType in class Layer
Parameters: layerIndex - Index of the layer; inputType - Type of input for the layer

public void defineParameters(SDLayerParams params)

Description copied from class: AbstractSameDiffLayer. Define the parameters for the network, using SDLayerParams.addWeightParam(String, long...) and SDLayerParams.addBiasParam(String, long...).

Specified by: defineParameters in class AbstractSameDiffLayer
Parameters: params - Object used to set parameters for this layer

public void initializeParameters(Map<String,INDArray> params)

Description copied from class: AbstractSameDiffLayer.

Specified by: initializeParameters in class AbstractSameDiffLayer
Parameters: params - Parameter arrays that may be initialized

public void applyGlobalConfigToLayer(NeuralNetConfiguration.Builder globalConfig)

Overrides: applyGlobalConfigToLayer in class AbstractSameDiffLayer

public void validateInput(INDArray input)

Description copied from class: SameDiffLayer. Validate input arrays to confirm that they fulfill the assumptions of the layer.

Overrides: validateInput in class SameDiffLayer
Parameters: input - input to the layer

public SDVariable defineLayer(SameDiff sameDiff, SDVariable layerInput, Map<String,SDVariable> paramTable, SDVariable mask)

Description copied from class: SameDiffLayer. Define the layer.

Specified by: defineLayer in class SameDiffLayer
Parameters: sameDiff - SameDiff instance; layerInput - Input to the layer; paramTable - Parameter table, with keys as defined by AbstractSameDiffLayer.defineParameters(SDLayerParams); mask - Optional, may be null. Mask to apply if supported

Copyright © 2022. All rights reserved.
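The three hooks documented above (`defineParameters`, `initializeParameters`, `defineLayer`) form the contract that any `SameDiffLayer` subclass, including this one, must satisfy. A hypothetical minimal subclass is sketched below under the assumption of DL4J's SameDiff layer API; the class name, parameter keys, and sizes are illustrative, not taken from this page.

```java
import java.util.Map;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.samediff.SDLayerParams;
import org.deeplearning4j.nn.conf.layers.samediff.SameDiffLayer;
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.ndarray.INDArray;

// Hypothetical minimal SameDiffLayer: a single dense transform
public class TinyDenseSameDiffLayer extends SameDiffLayer {
    private static final String WEIGHT_KEY = "W";  // illustrative parameter keys
    private static final String BIAS_KEY = "b";
    private final long nIn = 16;                   // assumed fixed sizes
    private final long nOut = 8;

    @Override
    public void defineParameters(SDLayerParams params) {
        // Register parameter names and shapes only; DL4J allocates the arrays
        params.addWeightParam(WEIGHT_KEY, nIn, nOut);
        params.addBiasParam(BIAS_KEY, 1, nOut);
    }

    @Override
    public void initializeParameters(Map<String, INDArray> params) {
        // Crude constant init for illustration; real layers use a WeightInit scheme
        params.get(WEIGHT_KEY).assign(0.01);
        params.get(BIAS_KEY).assign(0.0);
    }

    @Override
    public SDVariable defineLayer(SameDiff sameDiff, SDVariable layerInput,
                                  Map<String, SDVariable> paramTable, SDVariable mask) {
        // Forward pass is defined symbolically; gradients come from SameDiff autodiff
        SDVariable w = paramTable.get(WEIGHT_KEY);
        SDVariable b = paramTable.get(BIAS_KEY);
        return layerInput.mmul(w).add(b);
    }

    @Override
    public InputType getOutputType(int layerIndex, InputType inputType) {
        return InputType.feedForward(nOut);
    }
}
```

`RecurrentAttentionLayer` follows the same pattern, but its `defineLayer` builds the attention computation over the time dimension, using the mask argument to exclude padded time steps.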