Class Cnn3DLossLayer
- java.lang.Object
- org.deeplearning4j.nn.layers.AbstractLayer<LayerConfT>
- org.deeplearning4j.nn.layers.BaseLayer<Cnn3DLossLayer>
- org.deeplearning4j.nn.layers.convolution.Cnn3DLossLayer
-
- All Implemented Interfaces:
Serializable, Cloneable, Classifier, Layer, IOutputLayer, Model, Trainable
public class Cnn3DLossLayer extends BaseLayer<Cnn3DLossLayer> implements IOutputLayer
- See Also:
- Serialized Form
-
-
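The snippet below is a minimal configuration sketch showing where this layer typically sits in a network: as the final (output) layer after one or more 3D convolution layers. Note that the class documented on this page is the layer implementation; networks are normally configured via the configuration class of the same name (org.deeplearning4j.nn.conf.layers.Cnn3DLossLayer) and its Builder. The kernel size, channel counts, loss function, and activation below are illustrative assumptions, not requirements.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.Cnn3DLossLayer;
import org.deeplearning4j.nn.conf.layers.Convolution3D;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class Cnn3DLossLayerSketch {
    public static void main(String[] args) {
        // Sketch: a 3D convolution followed by a per-voxel loss layer.
        // All hyperparameters here (kernel size, channel counts, loss) are
        // illustrative; adapt them to your data and task.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new Convolution3D.Builder()
                        .kernelSize(3, 3, 3)   // 3x3x3 volumetric kernel
                        .nIn(1)                // input channels
                        .nOut(4)               // output channels
                        .build())
                .layer(new Cnn3DLossLayer.Builder(LossFunctions.LossFunction.MSE)
                        .activation(Activation.IDENTITY)
                        .build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
    }
}
```

In practice you would also set the input type (e.g. via setInputType with a 3D convolutional input type) so that layer shapes are inferred automatically; that step is omitted here for brevity.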
Nested Class Summary
-
Nested classes/interfaces inherited from interface org.deeplearning4j.nn.api.Layer
Layer.TrainingMode, Layer.Type
-
-
Field Summary
Fields:
protected INDArray labels
-
Fields inherited from class org.deeplearning4j.nn.layers.BaseLayer
gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver, weightNoiseParams
-
Fields inherited from class org.deeplearning4j.nn.layers.AbstractLayer
cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners
-
-
Constructor Summary
Constructors:
Cnn3DLossLayer(NeuralNetConfiguration conf, DataType dataType)
-
Method Summary
All Methods / Instance Methods / Concrete Methods

INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Perform forward pass and return the activations array with the last set input

Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Calculate the gradient relative to the error in the next layer

double calcRegularizationScore(boolean backpropParamsOnly)
Calculate the regularization component of the score for the parameters in this layer; for example, the L1, L2 and/or weight decay components of the loss function

double computeScore(double fullNetRegTerm, boolean training, LayerWorkspaceMgr workspaceMgr)
Compute the score after labels and input have been set.

INDArray computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr)
Compute the score for each example individually, after labels and input have been set.

double f1Score(INDArray examples, INDArray labels)
Returns the F1 score for the given examples.

double f1Score(DataSet data)
Sets the input and labels and returns a score for the prediction with respect to the true labels

Pair<INDArray,MaskState> feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState, int minibatchSize)
Feed forward the input mask array, setting it in the layer as appropriate.

void fit(INDArray examples, int[] labels)
Fit the model

void fit(INDArray examples, INDArray labels)
Fit the model

void fit(DataSet data)
Fit the model

void fit(DataSetIterator iter)
Train the model based on the DataSetIterator

boolean isPretrainLayer()
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.)

boolean needsLabels()
Returns true if labels are required for this output layer

int numLabels()
Returns the number of possible labels

int[] predict(INDArray examples)
Takes in a matrix of examples; for each row, returns a label

List<String> predict(DataSet dataSet)
Takes in a DataSet of examples; for each row, returns a label

void setMaskArray(INDArray maskArray)
Set the mask array.

Layer.Type type()
Returns the layer type
-
Methods inherited from class org.deeplearning4j.nn.layers.BaseLayer
clear, clearNoiseWeightParams, clone, computeGradientAndScore, fit, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, gradient, hasBias, hasLayerNorm, layerConf, numParams, params, paramTable, paramTable, preOutput, preOutputWithPreNorm, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, update
-
Methods inherited from class org.deeplearning4j.nn.layers.AbstractLayer
activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, close, conf, getConfig, getEpochCount, getHelper, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, gradientAndScore, init, input, layerId, numParams, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, updaterDivideByMinibatch
-
Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait
-
Methods inherited from interface org.deeplearning4j.nn.api.layers.IOutputLayer
getLabels, setLabels
-
Methods inherited from interface org.deeplearning4j.nn.api.Layer
activate, allowInputModification, clearNoiseWeightParams, getEpochCount, getHelper, getIndex, getInputMiniBatchSize, getIterationCount, getListeners, getMaskArray, setCacheMode, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setIterationCount, setListeners, setListeners
-
Methods inherited from interface org.deeplearning4j.nn.api.Model
addListeners, applyConstraints, batchSize, clear, close, computeGradientAndScore, conf, fit, fit, getGradientsViewArray, getOptimizer, getParam, gradient, gradientAndScore, init, input, numParams, numParams, params, paramTable, paramTable, score, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update
-
Methods inherited from interface org.deeplearning4j.nn.api.Trainable
getConfig, getGradientsViewArray, numParams, params, paramTable, updaterDivideByMinibatch
-
-
-
-
Field Detail
-
labels
protected INDArray labels
-
-
Constructor Detail
-
Cnn3DLossLayer
public Cnn3DLossLayer(NeuralNetConfiguration conf, DataType dataType)
-
-
Method Detail
-
backpropGradient
public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer
- Specified by:
backpropGradient in interface Layer
- Overrides:
backpropGradient in class BaseLayer<Cnn3DLossLayer>
- Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
- Returns:
- Pair where the Gradient is the gradient for this layer, and the INDArray is the epsilon (activation gradient) needed by the next layer, before the element-wise multiply by sigmaPrime(z). So for a standard feed-forward layer, if this layer is L, then return.getSecond() == dL/dIn = (w^(L)*(delta^(L))^T)^T. Note that the returned array should be placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager
-
calcRegularizationScore
public double calcRegularizationScore(boolean backpropParamsOnly)
Description copied from interface: Layer
Calculate the regularization component of the score for the parameters in this layer; for example, the L1, L2 and/or weight decay components of the loss function
- Specified by:
calcRegularizationScore in interface Layer
- Overrides:
calcRegularizationScore in class BaseLayer<Cnn3DLossLayer>
- Parameters:
backpropParamsOnly - If true: calculate the regularization score based on backprop params only. If false: calculate based on all params (including pretrain params, if any)
- Returns:
- the regularization score for this layer
-
f1Score
public double f1Score(DataSet data)
Description copied from interface: Classifier
Sets the input and labels and returns a score for the prediction with respect to the true labels
- Specified by:
f1Score in interface Classifier
- Parameters:
data - the data to score
- Returns:
- the score for the given input/label pairs
-
f1Score
public double f1Score(INDArray examples, INDArray labels)
Returns the F1 score for the given examples. Think of this as a measure of how much the classifier got right: the higher the number, the better, on a scale from 0 to 1.
- Specified by:
f1Score in interface Classifier
- Parameters:
examples - the examples to classify (one example in each row)
labels - the true labels
- Returns:
- the F1 score for the given examples and labels
-
numLabels
public int numLabels()
Description copied from interface: Classifier
Returns the number of possible labels
- Specified by:
numLabels in interface Classifier
- Returns:
- the number of possible labels for this classifier
-
fit
public void fit(DataSetIterator iter)
Description copied from interface: Classifier
Train the model based on the DataSetIterator
- Specified by:
fit in interface Classifier
- Parameters:
iter - the iterator to train on
-
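In typical use, fit(DataSetIterator) is called on the enclosing network rather than on this layer directly; the network then delegates loss and gradient computation to this output layer. A hedged usage sketch (the helper method and epoch count are hypothetical, not part of the API):

```java
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;

public class TrainingSketch {
    // Hypothetical helper: trains for a fixed number of epochs.
    // 'iter' is assumed to yield inputs and labels shaped for a
    // 3D-convolutional network ending in a Cnn3DLossLayer.
    static void train(MultiLayerNetwork net, DataSetIterator iter, int epochs) {
        for (int i = 0; i < epochs; i++) {
            iter.reset();   // rewind the iterator before each epoch
            net.fit(iter);  // one full pass over the data
        }
    }
}
```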
predict
public int[] predict(INDArray examples)
Description copied from interface: Classifier
Takes in a matrix of examples; for each row, returns a label
- Specified by:
predict in interface Classifier
- Parameters:
examples - the examples to classify (one example in each row)
- Returns:
- the labels for each example
-
predict
public List<String> predict(DataSet dataSet)
Description copied from interface: Classifier
Takes in a DataSet of examples; for each row, returns a label
- Specified by:
predict in interface Classifier
- Parameters:
dataSet - the examples to classify
- Returns:
- the labels for each example
-
fit
public void fit(INDArray examples, INDArray labels)
Description copied from interface: Classifier
Fit the model
- Specified by:
fit in interface Classifier
- Parameters:
examples - the examples to classify (one example in each row)
labels - the example labels (a binary outcome matrix)
-
fit
public void fit(DataSet data)
Description copied from interface: Classifier
Fit the model
- Specified by:
fit in interface Classifier
- Parameters:
data - the data to train on
-
fit
public void fit(INDArray examples, int[] labels)
Description copied from interface: Classifier
Fit the model
- Specified by:
fit in interface Classifier
- Parameters:
examples - the examples to classify (one example in each row)
labels - the labels for each example (the number of labels must match the number of rows in the examples array)
-
type
public Layer.Type type()
Description copied from interface: Layer
Returns the layer type
- Specified by:
type in interface Layer
- Overrides:
type in class AbstractLayer<Cnn3DLossLayer>
- Returns:
- the layer type
-
activate
public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Perform forward pass and return the activations array with the last set input
- Specified by:
activate in interface Layer
- Overrides:
activate in class BaseLayer<Cnn3DLossLayer>
- Parameters:
training - training or test mode
workspaceMgr - Workspace manager
- Returns:
- the activation (layer output) of the last specified input. Note that the returned array should be placed in the ArrayType.ACTIVATIONS workspace via the workspace manager
-
setMaskArray
public void setMaskArray(INDArray maskArray)
Description copied from interface: Layer
Set the mask array. Note: in general, Layer.feedForwardMaskArray(INDArray, MaskState, int) should be used in preference to this.
- Specified by:
setMaskArray in interface Layer
- Overrides:
setMaskArray in class AbstractLayer<Cnn3DLossLayer>
- Parameters:
maskArray - Mask array to set
-
isPretrainLayer
public boolean isPretrainLayer()
Description copied from interface: Layer
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.)
- Specified by:
isPretrainLayer in interface Layer
- Returns:
- true if the layer can be pretrained (using fit(INDArray)), false otherwise
-
feedForwardMaskArray
public Pair<INDArray,MaskState> feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState, int minibatchSize)
Description copied from interface: Layer
Feed forward the input mask array, setting it in the layer as appropriate. This allows different layers to handle masks differently. For example, bidirectional RNNs and normal RNNs operate differently with masks: the former set activations to 0 outside of the data-present region (and keep the mask active for future layers such as dense layers), whereas normal RNNs don't zero out the activations/errors, instead relying on backpropagated error arrays to handle the variable-length case.
This is also used, for example, for networks that contain global pooling layers, arbitrary preprocessors, etc.
- Specified by:
feedForwardMaskArray in interface Layer
- Overrides:
feedForwardMaskArray in class AbstractLayer<Cnn3DLossLayer>
- Parameters:
maskArray - Mask array to set
currentMaskState - Current state of the mask; see MaskState
minibatchSize - Current minibatch size. Needs to be known as it cannot always be inferred from the activations array due to reshaping (such as a DenseLayer within a recurrent neural network)
- Returns:
- New mask array after this layer, along with the new mask state.
-
needsLabels
public boolean needsLabels()
Description copied from interface: IOutputLayer
Returns true if labels are required for this output layer
- Specified by:
needsLabels in interface IOutputLayer
- Returns:
- true if this output layer needs labels, false otherwise
-
computeScore
public double computeScore(double fullNetRegTerm, boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: IOutputLayer
Compute the score after labels and input have been set.
- Specified by:
computeScore in interface IOutputLayer
- Parameters:
fullNetRegTerm - Regularization score (l1/l2/weight decay) for the entire network
training - whether the score should be calculated at train or test time (this affects things like application of dropout, etc.)
- Returns:
- score (loss function)
-
computeScoreForExamples
public INDArray computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr)
Compute the score for each example individually, after labels and input have been set.
- Specified by:
computeScoreForExamples in interface IOutputLayer
- Parameters:
fullNetRegTerm - Regularization score term for the entire network (or 0.0 to not include regularization)
- Returns:
- a column INDArray of shape [numExamples, 1], where entry i is the score of the ith example
-
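Per-example scores are usually obtained at the network level, which delegates to this method on the output layer. A hedged usage sketch (the helper method name is hypothetical; MultiLayerNetwork.scoreExamples is the assumed entry point):

```java
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.DataSet;

public class PerExampleScoreSketch {
    // Hypothetical helper: per-example loss values for a fitted network.
    // The returned array is a [numExamples, 1] column vector, matching the
    // contract of computeScoreForExamples(...).
    static INDArray scorePerExample(MultiLayerNetwork net, DataSet data) {
        boolean addRegularization = true; // include l1/l2/weight-decay terms
        return net.scoreExamples(data, addRegularization);
    }
}
```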
-