Class DenseLayer
- java.lang.Object
  - org.deeplearning4j.nn.layers.AbstractLayer<LayerConfT>
    - org.deeplearning4j.nn.layers.BaseLayer<DenseLayer>
      - org.deeplearning4j.nn.layers.feedforward.dense.DenseLayer
-
- All Implemented Interfaces:
  Serializable, Cloneable, Layer, Model, Trainable
-
public class DenseLayer extends BaseLayer<DenseLayer>
- Author:
- Adam Gibson
- See Also:
- Serialized Form
-
-
Nested Class Summary
-
Nested classes/interfaces inherited from interface org.deeplearning4j.nn.api.Layer
Layer.TrainingMode, Layer.Type
-
-
Field Summary
-
Fields inherited from class org.deeplearning4j.nn.layers.BaseLayer
gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver, weightNoiseParams
-
Fields inherited from class org.deeplearning4j.nn.layers.AbstractLayer
cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners
-
-
Constructor Summary
Constructors:
  DenseLayer(NeuralNetConfiguration conf, DataType dataType)
-
Method Summary
void fit(INDArray input, LayerWorkspaceMgr workspaceMgr)
  Fit the model to the given data
boolean hasBias()
  Does this layer have a bias term? Many layers (dense, convolutional, output, embedding) have biases by default, but no-bias versions are possible via configuration
boolean hasLayerNorm()
  Does this layer support layer normalization, and is it enabled? Only Dense and SimpleRNN layers support layer normalization.
boolean isPretrainLayer()
  Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc)
-
Methods inherited from class org.deeplearning4j.nn.layers.BaseLayer
activate, backpropGradient, calcRegularizationScore, clear, clearNoiseWeightParams, clone, computeGradientAndScore, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, gradient, layerConf, numParams, params, paramTable, paramTable, preOutput, preOutputWithPreNorm, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, update
-
Methods inherited from class org.deeplearning4j.nn.layers.AbstractLayer
activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, close, conf, feedForwardMaskArray, getConfig, getEpochCount, getHelper, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, gradientAndScore, init, input, layerId, numParams, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, type, updaterDivideByMinibatch
-
Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait
-
Methods inherited from interface org.deeplearning4j.nn.api.Layer
getIterationCount, setIterationCount
-
-
Constructor Detail
-
DenseLayer
public DenseLayer(NeuralNetConfiguration conf, DataType dataType)
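This constructor is normally invoked internally by DL4J when a network is initialized from a layer configuration, rather than called directly. The sketch below shows the usual route: describing the dense layer with the configuration class org.deeplearning4j.nn.conf.layers.DenseLayer and letting MultiLayerNetwork.init() create the runtime layer documented on this page. Layer sizes, seed, and activation choices are illustrative assumptions, not values taken from this page.

    import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
    import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
    import org.deeplearning4j.nn.conf.layers.OutputLayer;
    import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
    import org.nd4j.linalg.activations.Activation;
    import org.nd4j.linalg.lossfunctions.LossFunctions;

    // Describe the network with configuration classes; the runtime
    // org.deeplearning4j.nn.layers.feedforward.dense.DenseLayer documented here
    // is instantiated internally from org.deeplearning4j.nn.conf.layers.DenseLayer.
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(42)
            .list()
            .layer(0, new org.deeplearning4j.nn.conf.layers.DenseLayer.Builder()
                    .nIn(784).nOut(100)
                    .activation(Activation.RELU)
                    .build())
            .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                    .nIn(100).nOut(10)
                    .activation(Activation.SOFTMAX)
                    .build())
            .build();

    MultiLayerNetwork net = new MultiLayerNetwork(conf);
    net.init();  // creates the runtime DenseLayer instances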
-
-
Method Detail
-
fit
public void fit(INDArray input, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Model
Fit the model to the given data
- Specified by:
  fit in interface Model
- Overrides:
  fit in class BaseLayer<DenseLayer>
- Parameters:
  input - the data to fit the model to
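In practice a DenseLayer is fit as part of an enclosing network rather than by calling this method on the layer in isolation. A minimal sketch, assuming net is an initialized MultiLayerNetwork (for example, the one built in the constructor example above) and using illustrative array shapes:

    import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
    import org.nd4j.linalg.api.ndarray.INDArray;
    import org.nd4j.linalg.factory.Nd4j;

    static void fitDenseNetwork(MultiLayerNetwork net) {
        // Illustrative random data: 32 examples, 784 input features, 10 output classes.
        INDArray features = Nd4j.rand(32, 784);
        INDArray labels = Nd4j.zeros(32, 10);
        labels.putScalar(new int[]{0, 3}, 1.0);  // one-hot label for the first example

        // Fitting the network fits every contained layer, including the DenseLayer.
        net.fit(features, labels);
    }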
-
isPretrainLayer
public boolean isPretrainLayer()
Description copied from interface: Layer
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc)
- Returns:
  true if the layer can be pretrained (using fit(INDArray)), false otherwise
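For a plain DenseLayer this returns false: dense layers are trained in a supervised manner, not via unsupervised layer-wise pretraining. A short sketch, assuming net is an initialized MultiLayerNetwork:

    import org.deeplearning4j.nn.api.Layer;
    import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

    static void printPretrainability(MultiLayerNetwork net) {
        for (Layer l : net.getLayers()) {
            // A DenseLayer prints "false" here; pretrainable layers (AE, VAE) print "true".
            System.out.println(l.getClass().getSimpleName() + " pretrainable: " + l.isPretrainLayer());
        }
    }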
-
hasBias
public boolean hasBias()
Description copied from class: BaseLayer
Does this layer have a bias term? Many layers (dense, convolutional, output, embedding) have biases by default, but no-bias versions are possible via configuration
- Overrides:
  hasBias in class BaseLayer<DenseLayer>
- Returns:
  True if a bias term is present, false otherwise
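Whether the runtime layer has a bias is taken from its configuration. A minimal sketch of a no-bias dense layer, assuming a hasBias(boolean) option on the configuration builder (org.deeplearning4j.nn.conf.layers.DenseLayer.Builder); sizes and activation are illustrative:

    import org.deeplearning4j.nn.conf.layers.DenseLayer;
    import org.nd4j.linalg.activations.Activation;

    // Configured without a bias term; hasBias() on the resulting runtime layer
    // then returns false.
    DenseLayer noBias = new DenseLayer.Builder()
            .nIn(100).nOut(50)
            .activation(Activation.TANH)
            .hasBias(false)  // assumed builder option for disabling the bias
            .build();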
-
hasLayerNorm
public boolean hasLayerNorm()
Description copied from class: BaseLayer
Does this layer support layer normalization, and is it enabled? Only Dense and SimpleRNN layers support layer normalization.
- Overrides:
  hasLayerNorm in class BaseLayer<DenseLayer>
- Returns:
  True if layer normalization is enabled on this layer, false otherwise
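Layer normalization is likewise enabled through the configuration. A minimal sketch, assuming a hasLayerNorm(boolean) option on the same configuration builder; sizes and activation are illustrative:

    import org.deeplearning4j.nn.conf.layers.DenseLayer;
    import org.nd4j.linalg.activations.Activation;

    // A dense layer with layer normalization enabled; only Dense and SimpleRNN
    // layers support this option.
    DenseLayer withNorm = new DenseLayer.Builder()
            .nIn(100).nOut(50)
            .activation(Activation.RELU)
            .hasLayerNorm(true)  // assumed builder option for enabling layer norm
            .build();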
-
-