Base class for attention mechanisms.
RNN cell that wraps another RNN cell and adds support for attention to it.
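For intuition, here is a minimal Scala sketch of one step of such a wrapped cell, over plain arrays; `step`, `cell`, and `score` are illustrative names and signatures, not this library's API:

```scala
object AttentionWrapperStepSketch {
  def softmax(xs: Array[Double]): Array[Double] = {
    val m = xs.max
    val es = xs.map(x => math.exp(x - m))
    val s = es.sum
    es.map(_ / s)
  }

  // One step: feed the previous attention back in with the inputs, run
  // the wrapped cell, then attend over the memory with its output.
  def step(
      cell: (Array[Double], Array[Double]) => (Array[Double], Array[Double]),
      score: (Array[Double], Array[Array[Double]]) => Array[Double],
      memory: Array[Array[Double]],   // [time][units]
      inputs: Array[Double],
      cellState: Array[Double],
      prevAttention: Array[Double]
  ): (Array[Double], Array[Double], Array[Double]) = {
    val (cellOutput, nextCellState) = cell(inputs ++ prevAttention, cellState)
    val alignments = softmax(score(cellOutput, memory))            // [time]
    val units = memory.head.length
    val attention = Array.tabulate(units) { i =>                   // context vector
      memory.indices.map(t => alignments(t) * memory(t)(i)).sum
    }
    (attention, alignments, nextCellState)
  }
}
```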
State of the attention wrapper RNN cell.
Wrapped cell state.
`int32` scalar containing the current time step.
Attention emitted at the previous time step.
Alignments emitted at the previous time step for each attention mechanism.
Alignments emitted at all time steps for each attention mechanism. Call `stack()` on each of the tensor arrays to convert them to tensors.
Attention cell state.
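Sketched as a plain case class, the state carries the fields listed above (hypothetical field names and type parameters; the library defines the actual state type):

```scala
// Hypothetical mirror of the attention wrapper state described above.
case class AttentionWrapperStateSketch[CellState, Tensor](
    cellState: CellState,               // wrapped cell state
    time: Int,                          // current time step (an int32 scalar in-graph)
    attention: Tensor,                  // attention emitted at the previous time step
    alignments: Seq[Tensor],            // previous alignments, one per attention mechanism
    alignmentHistory: Seq[Seq[Tensor]], // alignments at all time steps; stack each to a tensor
    attentionState: Seq[Tensor]         // attention cell state
)
```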
Bahdanau-style (additive) attention scoring.
This attention has two forms. The first is Bahdanau attention, as described in: ["Neural Machine Translation by Jointly Learning to Align and Translate.", ICLR 2015](https://arxiv.org/abs/1409.0473).
The second is a normalized form inspired by the weight normalization method described in: ["Weight Normalization: A Simple Reparameterization to Accelerate Training of Deep Neural Networks.", NIPS 2016](https://arxiv.org/abs/1602.07868).
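For illustration, a minimal Scala sketch of both scoring forms over plain arrays, assuming the query and key projections have already been applied; `bahdanauScore`, `normScale`, and `bias` are hypothetical names, not this library's API:

```scala
object BahdanauScoreSketch {
  // Additive score: score_j = v . tanh(k_j + q [+ b]). Applying a
  // softmax to the scores yields the alignments.
  def bahdanauScore(
      query: Array[Double],              // processed decoder state, [units]
      keys: Array[Array[Double]],        // processed memory, [time][units]
      v: Array[Double],                  // attention vector, [units]
      normScale: Option[Double] = None,  // g: enables the normalized form
      bias: Option[Array[Double]] = None
  ): Array[Double] = {
    val vUsed = normScale match {
      // Normalized form: weight-normalize v, i.e. vHat = g * v / ||v||
      // (Salimans & Kingma, 2016), with a bias added inside the tanh.
      case Some(g) =>
        val norm = math.sqrt(v.map(x => x * x).sum)
        v.map(x => g * x / norm)
      case None => v
    }
    keys.map { k =>
      k.indices.map { i =>
        vUsed(i) * math.tanh(k(i) + query(i) + bias.map(_(i)).getOrElse(0.0))
      }.sum
    }
  }
}
```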
Luong-style (multiplicative) attention scoring.
This attention has two forms. The first is standard Luong attention, as described in: ["Effective Approaches to Attention-based Neural Machine Translation.", EMNLP 2015](https://arxiv.org/abs/1508.04025).
The second is the scaled form, inspired partly by the normalized form of Bahdanau attention. To enable the second form, construct the object with `weightsScale` set to the value of a scalar scaling variable.
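A matching Scala sketch of the two Luong forms, again over plain arrays and with the memory projection assumed already applied; the names are illustrative, not this library's API:

```scala
object LuongScoreSketch {
  // Multiplicative score: score_j = q . k_j. Supplying weightsScale
  // multiplies the scores by that scalar, giving the second form.
  def luongScore(
      query: Array[Double],          // decoder state, [units]
      keys: Array[Array[Double]],    // processed memory, [time][units]
      weightsScale: Option[Double] = None
  ): Array[Double] = {
    val scores = keys.map(k => k.zip(query).map { case (a, b) => a * b }.sum)
    weightsScale.fold(scores)(g => scores.map(_ * g))
  }
}
```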
Base class for attention models that use the previous alignment as their state.
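Concretely, the state threading looks like the following sketch (hypothetical names; the score function may condition on the previous alignments, as monotonic variants do):

```scala
object AlignmentStateSketch {
  def softmax(xs: Array[Double]): Array[Double] = {
    val m = xs.max
    val es = xs.map(x => math.exp(x - m))
    val s = es.sum
    es.map(_ / s)
  }

  // One step: the new alignments are both the output and the next state.
  def step(
      score: (Array[Double], Array[Double]) => Array[Double],
      query: Array[Double],
      previousAlignments: Array[Double]
  ): (Array[Double], Array[Double]) = {
    val alignments = softmax(score(query, previousAlignments))
    (alignments, alignments) // (output, next state)
  }
}
```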