public class DropoutLayer extends BaseLayer<DropoutLayer>
Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

Fields inherited from class BaseLayer: gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver, weightNoiseParams

Fields inherited from class AbstractLayer: cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners

| Constructor and Description |
|---|
| DropoutLayer(NeuralNetConfiguration conf, DataType dataType) |
| Modifier and Type | Method and Description |
|---|---|
| INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr) - Perform forward pass and return the activations array with the last set input |
| Pair<Gradient,INDArray> | backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr) - Calculate the gradient relative to the error in the next layer |
| double | calcRegularizationScore(boolean backpropParamsOnly) - Calculate the regularization component of the score, for the parameters in this layer. For example, the L1, L2 and/or weight decay components of the loss function |
| void | fit(INDArray input, LayerWorkspaceMgr workspaceMgr) - Fit the model to the given data |
| boolean | isPretrainLayer() - Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc) |
| INDArray | params() - Returns the parameters of the neural network as a flattened row vector |
| Layer.Type | type() - Returns the layer type |
Methods inherited from class BaseLayer: clear, clearNoiseWeightParams, clone, computeGradientAndScore, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, gradient, hasBias, hasLayerNorm, layerConf, numParams, paramTable, paramTable, preOutput, preOutputWithPreNorm, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, update

Methods inherited from class AbstractLayer: activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, close, conf, feedForwardMaskArray, getConfig, getEpochCount, getHelper, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, gradientAndScore, init, input, layerId, numParams, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, updaterDivideByMinibatch

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer: getIterationCount, setIterationCount

public DropoutLayer(NeuralNetConfiguration conf, DataType dataType)
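In practice this implementation class is rarely constructed directly; it is instantiated from its configuration counterpart (org.deeplearning4j.nn.conf.layers.DropoutLayer) when a network is initialized. The following is a minimal sketch, assuming the standard DL4J configuration API; note that in DL4J's convention the dropout value passed to the builder is the probability of retaining an activation:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.DropoutLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class DropoutLayerExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new DenseLayer.Builder().nIn(784).nOut(256)
                        .activation(Activation.RELU).build())
                // Standalone dropout layer; 0.5 = probability of retaining each activation
                .layer(new DropoutLayer.Builder(0.5).build())
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(256).nOut(10).activation(Activation.SOFTMAX).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();  // instantiates org.deeplearning4j.nn.layers.DropoutLayer internally
    }
}
```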
public double calcRegularizationScore(boolean backpropParamsOnly)

Specified by: calcRegularizationScore in interface Layer
Overrides: calcRegularizationScore in class BaseLayer<DropoutLayer>
Parameters: backpropParamsOnly - If true: calculate the regularization score based on backprop params only. If false: calculate based on all params (including pretrain params, if any)
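Since a dropout layer has no trainable parameters, its regularization contribution should be zero regardless of the flag. A small sketch; the expected value is an assumption that follows from the layer being parameter-free:

```java
// 'net' is the MultiLayerNetwork from the configuration sketch above; index 1 is the dropout layer
double regScore = net.getLayer(1).calcRegularizationScore(false);
System.out.println(regScore);  // expected: 0.0, since dropout has no L1/L2-penalized parameters (assumption)
```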
public Layer.Type type()

Specified by: type in interface Layer
Overrides: type in class AbstractLayer<DropoutLayer>
public void fit(INDArray input, LayerWorkspaceMgr workspaceMgr)

Specified by: fit in interface Model
Overrides: fit in class BaseLayer<DropoutLayer>
Parameters: input - the data to fit the model to
public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)

Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseLayer<DropoutLayer>
Parameters:
epsilon - w^(L+1) * delta^(L+1). Or, equivalently, dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns: the gradient pair, where the returned epsilon array is defined in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager
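A hedged sketch of a manual backward pass through this layer: epsilon (dC/da from the layer above) is propagated through the same dropout mask that the preceding training-mode forward pass applied. This assumes the network built above; the Pair import reflects a recent ND4J package layout, which varies across versions:

```java
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.common.primitives.Pair;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

// Continuing from the configuration sketch above; index 1 is the dropout layer
Layer dropout = net.getLayer(1);
LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();

// Training-mode forward pass first, so the layer has an input and dropout mask set
INDArray in = Nd4j.rand(32, 256);
INDArray out = dropout.activate(in, true, mgr);

// epsilon: dC/da from the layer above, same shape as this layer's activations
INDArray epsilon = Nd4j.onesLike(out);

Pair<Gradient, INDArray> result = dropout.backpropGradient(epsilon, mgr);
INDArray epsNext = result.getSecond();  // dC/da for the layer below (masked like the forward pass)
// result.getFirst() carries no parameter gradients: dropout is parameter-free
```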
public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)

Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<DropoutLayer>
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns: the activations array, defined in the ArrayType.ACTIVATIONS workspace via the workspace manager
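A sketch contrasting training- and test-mode activation, reusing the variables from the backpropGradient sketch above and the three-argument activate overload (which sets the input before the forward pass). DL4J applies inverted dropout only when training is true, so test-mode output is expected to pass the input through unchanged:

```java
// Reusing 'dropout', 'mgr', and 'in' from the previous sketch
INDArray trainOut = dropout.activate(in, true, mgr);   // dropout mask applied, survivors rescaled
INDArray testOut  = dropout.activate(in, false, mgr);  // dropout disabled at inference: expected to equal the input
```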
public boolean isPretrainLayer()

Specified by: isPretrainLayer in interface Layer
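A quick inspection sketch tying the remaining accessors together; the expected values are assumptions that follow from dropout being a parameter-free feed-forward transform:

```java
System.out.println(dropout.type());            // expected: FEED_FORWARD (assumption)
System.out.println(dropout.isPretrainLayer()); // false: dropout supports no unsupervised pretraining
System.out.println(dropout.numParams());       // 0: the layer is parameter-free
```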