public class RNN extends RecurrentBlock
`RNN` is an implementation of recurrent neural networks that applies a single-gate recurrent layer to its input. Two kinds of activation function are supported: ReLU and Tanh. The current implementation follows the [paper](https://crl.ucsd.edu/~elman/Papers/fsit.pdf) "Finding Structure in Time" (Elman, 1988).
The RNN operator is formulated as below:
With the ReLU activation function: \(h_t = \operatorname{relu}(W_{ih} * x_t + b_{ih} + W_{hh} * h_{(t-1)} + b_{hh})\)
With the Tanh activation function: \(h_t = \tanh(W_{ih} * x_t + b_{ih} + W_{hh} * h_{(t-1)} + b_{hh})\)
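The tanh recurrence above can be sketched in plain Java arrays (illustrative only; the actual `RNN` block delegates this computation to the underlying engine, not to Java loops):

```java
import java.util.Arrays;

/**
 * Minimal sketch of the single-gate Elman recurrence:
 *   h_t = tanh(W_ih * x_t + b_ih + W_hh * h_{t-1} + b_hh)
 */
public class ElmanStep {

    /** Computes one time step of the tanh recurrence for a single example. */
    static double[] step(double[][] wIh, double[] bIh,
                         double[][] wHh, double[] bHh,
                         double[] x, double[] hPrev) {
        int stateSize = wIh.length;
        double[] h = new double[stateSize];
        for (int i = 0; i < stateSize; i++) {
            double sum = bIh[i] + bHh[i];
            for (int j = 0; j < x.length; j++) {
                sum += wIh[i][j] * x[j];       // input-to-hidden contribution
            }
            for (int j = 0; j < hPrev.length; j++) {
                sum += wHh[i][j] * hPrev[j];   // hidden-to-hidden contribution
            }
            h[i] = Math.tanh(sum);             // swap in Math.max(0, sum) for ReLU
        }
        return h;
    }

    public static void main(String[] args) {
        double[][] wIh = {{0.5, -0.5}, {1.0, 0.0}};
        double[] bIh = {0.0, 0.1};
        double[][] wHh = {{0.1, 0.2}, {-0.2, 0.1}};
        double[] bHh = {0.0, 0.0};
        double[] h = {0.0, 0.0};
        double[][] inputs = {{1.0, 0.0}, {0.0, 1.0}};
        // The hidden state h is carried from one time step to the next.
        for (double[] x : inputs) {
            h = step(wIh, bIh, wHh, bHh, x, h);
        }
        System.out.println(Arrays.toString(h)); // each entry lies in (-1, 1)
    }
}
```

Because tanh squashes its argument, every entry of the hidden state stays in (-1, 1) regardless of the input magnitude; with ReLU the state is instead non-negative and unbounded above.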
Modifier and Type | Class and Description
---|---
static class | `RNN.Activation`: An enum that enumerates the types of activation.
static class | `RNN.Builder`
Nested classes/interfaces inherited from class `RecurrentBlock`:
`RecurrentBlock.BaseBuilder<T extends RecurrentBlock.BaseBuilder>`

Fields inherited from class `RecurrentBlock`:
beginState, dropRate, gates, mode, numDirections, numStackedLayers, stateOutputs, stateSize, useSequenceLength

Fields inherited from class `AbstractBlock`:
children, inputNames, inputShapes, parameters, parameterShapeCallbacks, version
Modifier and Type | Method and Description
---|---
static RNN.Builder | `builder()`: Creates a builder to build a `RNN`.
Methods inherited from class `RecurrentBlock`:
beforeInitialize, forward, getOutputShapes, getParameterShape, isBidirectional, loadMetadata, opInputs, resetBeginStates, setBeginStates, setStateOutputs, updateInputLayoutToTNC, validateInputSize

Methods inherited from class `AbstractBlock`:
addChildBlock, addParameter, addParameter, addParameter, cast, clear, describeInput, getChildren, getDirectParameters, getParameters, initialize, initializeChildBlocks, isInitialized, loadParameters, readInputShapes, saveInputShapes, saveMetadata, saveParameters, setInitializer, setInitializer, toString

Methods inherited from class `java.lang.Object`:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface `Block`:
forward, forward, validateLayout
public static RNN.Builder builder()
Creates a builder to build a `RNN`.
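A typical use of `builder()` might look like the sketch below. The setter names are assumptions inferred from the builder fields listed above (`stateSize`, `numStackedLayers`, and so on); consult the `RNN.Builder` documentation for the exact method names in your DJL version.

```java
// Hypothetical usage sketch; requires the DJL dependency on the classpath.
// Setter names (setStateSize, setNumStackedLayers, setActivation) are
// assumptions, not confirmed API.
RNN rnn = RNN.builder()
        .setStateSize(64)                    // size of the hidden state h_t
        .setNumStackedLayers(1)              // number of stacked recurrent layers
        .setActivation(RNN.Activation.TANH)  // or RNN.Activation.RELU
        .build();
```

The resulting `RNN` is a `Block` and can be composed, initialized, and invoked like any other DJL block.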