public class LSTM extends AbstractLSTM
Nested Class Summary

| Modifier and Type | Class and Description |
|---|---|
| static class | LSTM.Builder |
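The nested LSTM.Builder class is the usual way to configure this layer before adding it to a network. Below is a minimal configuration sketch; the layer sizes, the Adam updater, and the RnnOutputLayer on top are illustrative choices for a hypothetical sequence-classification setup, not part of this API page:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class LstmConfigExample {
    public static void main(String[] args) {
        int nIn = 50;      // number of input features per time step (hypothetical)
        int nHidden = 128; // LSTM layer size (hypothetical)
        int nOut = 10;     // number of output classes (hypothetical)

        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .updater(new Adam(1e-3))
                .list()
                // LSTM layer configured via the nested LSTM.Builder
                .layer(0, new LSTM.Builder()
                        .nIn(nIn)
                        .nOut(nHidden)
                        .activation(Activation.TANH)
                        .build())
                // Recurrent output layer stacked on the LSTM
                .layer(1, new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(nHidden)
                        .nOut(nOut)
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
    }
}
```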
Fields inherited from class BaseRecurrentLayer: distRecurrent, weightInitRecurrent

Fields inherited from class FeedForwardLayer: nIn, nOut

Fields inherited from class BaseLayer: activationFn, biasInit, biasUpdater, dist, gradientNormalization, gradientNormalizationThreshold, iUpdater, l1, l1Bias, l2, l2Bias, weightInit, weightNoise

Fields inherited from class Layer: constraints, iDropout, layerName

Method Summary

| Modifier and Type | Method and Description |
|---|---|
| LayerMemoryReport | getMemoryReport(InputType inputType): This is a report of the estimated memory consumption for the given layer. |
| protected void | initializeConstraints(Layer.Builder<?> builder): Initialize the weight constraints. |
| ParamInitializer | initializer() |
| Layer | instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, org.nd4j.linalg.api.ndarray.INDArray layerParamsView, boolean initializeParams) |
Methods inherited from class AbstractLSTM: getL1ByParam, getL2ByParam

Methods inherited from class BaseRecurrentLayer: getOutputType, getPreProcessorForInputType, setNIn

Methods inherited from class FeedForwardLayer: isPretrainParam

Methods inherited from class BaseLayer: clone, getUpdaterByParam, resetLayerDefaultConfig

Method Detail

protected void initializeConstraints(Layer.Builder<?> builder)
Initialize the weight constraints.
Overrides: initializeConstraints in class Layer

public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, org.nd4j.linalg.api.ndarray.INDArray layerParamsView, boolean initializeParams)
Specified by: instantiate in class Layer

public ParamInitializer initializer()
Specified by: initializer in class Layer

public LayerMemoryReport getMemoryReport(InputType inputType)
This is a report of the estimated memory consumption for the given layer.
Specified by: getMemoryReport in class Layer
Parameters: inputType - Input type to the layer. Memory consumption is often a function of the input type.
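A short sketch of querying the memory report for a standalone configured layer; the nIn/nOut values and the InputType.recurrent(50) input are hypothetical, chosen only to match the layer's input size:

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.memory.LayerMemoryReport;

public class LstmMemoryReportExample {
    public static void main(String[] args) {
        // Configure an LSTM layer with hypothetical sizes
        LSTM lstm = new LSTM.Builder()
                .nIn(50)
                .nOut(128)
                .build();

        // Estimate memory consumption for recurrent input with 50 features per time step
        LayerMemoryReport report = lstm.getMemoryReport(InputType.recurrent(50));
        System.out.println(report);
    }
}
```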