public class SimpleRnn extends BaseRecurrentLayer
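SimpleRnn is the "vanilla" recurrent layer: each timestep's output is a function of the current input and the previous output. As a minimal stand-alone sketch of that recurrence (plain Java, not DL4J code; the names `w`, `rw`, `b` and the `tanh` activation are illustrative assumptions, not the layer's actual parameter keys):

```java
// Stand-alone sketch of one vanilla-RNN timestep:
//   out_t = activation(in_t * W + out_{t-1} * RW + b)
// W  = input weights  [nIn x nOut]
// RW = recurrent weights [nOut x nOut]
// b  = bias [nOut]
public class VanillaRnnStep {
    static double[] step(double[] in, double[] prevOut,
                         double[][] w, double[][] rw, double[] b) {
        int nOut = b.length;
        double[] out = new double[nOut];
        for (int j = 0; j < nOut; j++) {
            double s = b[j];
            for (int i = 0; i < in.length; i++) {
                s += in[i] * w[i][j];        // contribution of current input
            }
            for (int k = 0; k < nOut; k++) {
                s += prevOut[k] * rw[k][j];  // contribution of previous output
            }
            out[j] = Math.tanh(s);           // tanh chosen for illustration
        }
        return out;
    }

    public static void main(String[] args) {
        // One unit, one input: s = 1.0 * 0.5 + 0.0 * 0.1 + 0.0 = 0.5
        double[] out = step(new double[]{1.0}, new double[]{0.0},
                new double[][]{{0.5}}, new double[][]{{0.1}}, new double[]{0.0});
        System.out.println(out[0]); // tanh(0.5)
    }
}
```

In the real layer this recurrence runs vectorized over the whole minibatch and time axis; the sketch only shows the per-timestep arithmetic.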
| Modifier and Type | Class and Description |
|---|---|
| static class | SimpleRnn.Builder |
Fields inherited from class BaseRecurrentLayer: rnnDataFormat, weightInitFnRecurrent

Fields inherited from class FeedForwardLayer: nIn, nOut, timeDistributedFormat

Fields inherited from class BaseLayer: activationFn, biasInit, biasUpdater, gainInit, gradientNormalization, gradientNormalizationThreshold, iUpdater, regularization, regularizationBias, weightInitFn, weightNoise

Fields inherited from class Layer: constraints, iDropout, layerName

| Modifier | Constructor and Description |
|---|---|
| protected | SimpleRnn(SimpleRnn.Builder builder) |
| Modifier and Type | Method and Description |
|---|---|
| LayerMemoryReport | getMemoryReport(InputType inputType): This is a report of the estimated memory consumption for the given layer |
| boolean | hasLayerNorm() |
| ParamInitializer | initializer() |
| Layer | instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType) |
Methods inherited from class BaseRecurrentLayer: getOutputType, getPreProcessorForInputType, setNIn

Methods inherited from class FeedForwardLayer: isPretrainParam

Methods inherited from class BaseLayer: clone, getGradientNormalization, getRegularizationByParam, getUpdaterByParam, resetLayerDefaultConfig

Methods inherited from class Layer: initializeConstraints, setDataType

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface TrainingConfig: getGradientNormalizationThreshold, getLayerName

protected SimpleRnn(SimpleRnn.Builder builder)
public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)

Specified by: instantiate in class Layer

public ParamInitializer initializer()

Specified by: initializer in class Layer

public LayerMemoryReport getMemoryReport(InputType inputType)

Specified by: getMemoryReport in class Layer

Parameters: inputType - Input type to the layer. Memory consumption is often a function of the input type.

public boolean hasLayerNorm()
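As a usage sketch, a SimpleRnn layer is normally configured through SimpleRnn.Builder inside a network configuration rather than via the protected constructor. This assumes the standard deeplearning4j builder API; the layer sizes, activations, and the RnnOutputLayer pairing below are illustrative choices, not defaults:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.conf.layers.recurrent.SimpleRnn;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class SimpleRnnConfigExample {
    public static void main(String[] args) {
        // Illustrative configuration: a SimpleRnn hidden layer feeding an
        // RNN output layer. nIn/nOut values here are assumptions.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new SimpleRnn.Builder()
                        .nIn(10)                          // input features per timestep
                        .nOut(20)                         // recurrent units
                        .activation(Activation.TANH)
                        .build())
                .layer(new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(20)
                        .nOut(5)                          // output classes per timestep
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();
        System.out.println(conf.toJson());
    }
}
```

The builder inherits nIn/nOut from FeedForwardLayer and the recurrent-specific options (such as rnnDataFormat and weightInitFnRecurrent) from BaseRecurrentLayer, matching the inherited fields listed above.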
Copyright © 2022. All rights reserved.