public class CnnLossLayer extends BaseLayer<CnnLossLayer> implements IOutputLayer

Nested classes/interfaces inherited from interface Layer: `Layer.TrainingMode`, `Layer.Type`

| Modifier and Type | Field and Description |
|---|---|
| `protected INDArray` | `labels` |

Fields inherited from class BaseLayer: `gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver, weightNoiseParams`

Fields inherited from class AbstractLayer: `cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners`

| Constructor and Description |
|---|
| `CnnLossLayer(NeuralNetConfiguration conf, DataType dataType)` |
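The constructor above is rarely called directly; a `CnnLossLayer` is normally added through the configuration builder API. The following is a minimal configuration sketch only: the layer sizes, loss function, activation, and input dimensions are illustrative assumptions, not part of this page.

```java
import org.deeplearning4j.nn.conf.ConvolutionMode;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.CnnLossLayer;
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

// Sketch only: a tiny segmentation-style network ending in a CnnLossLayer,
// which applies a per-pixel loss over 4D activations [minibatch, channels, h, w].
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .list()
        // 3x3 convolution; ConvolutionMode.Same preserves spatial dimensions
        .layer(new ConvolutionLayer.Builder(3, 3)
                .nIn(1).nOut(4)
                .convolutionMode(ConvolutionMode.Same)
                .activation(Activation.RELU)
                .build())
        // Output layer: sigmoid activation with per-pixel cross-entropy loss
        .layer(new CnnLossLayer.Builder(LossFunctions.LossFunction.XENT)
                .activation(Activation.SIGMOID)
                .build())
        // Illustrative input: 8x8 single-channel images
        .setInputType(InputType.convolutional(8, 8, 1))
        .build();
```

Note that `CnnLossLayer` has no parameters of its own; the preceding layer must already produce the desired number of output channels.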
| Modifier and Type | Method and Description |
|---|---|
| `INDArray` | `activate(boolean training, LayerWorkspaceMgr workspaceMgr)`<br>Perform the forward pass and return the activations array, using the last set input |
| `Pair<Gradient,INDArray>` | `backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)`<br>Calculate the gradient relative to the error in the next layer |
| `double` | `calcRegularizationScore(boolean backpropParamsOnly)`<br>Calculate the regularization component of the score for the parameters in this layer, for example the L1, L2 and/or weight decay components of the loss function |
| `double` | `computeScore(double fullNetRegTerm, boolean training, LayerWorkspaceMgr workspaceMgr)`<br>Compute the score after labels and input have been set. |
| `INDArray` | `computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr)`<br>Compute the score for each example individually, after labels and input have been set. |
| `double` | `f1Score(DataSet data)`<br>Sets the input and labels, and returns a score for the prediction with respect to the true labels |
| `double` | `f1Score(INDArray examples, INDArray labels)`<br>Returns the F1 score for the given examples. |
| `Pair<INDArray,MaskState>` | `feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState, int minibatchSize)`<br>Feed forward the input mask array, setting it in the layer as appropriate. |
| `void` | `fit(DataSet data)`<br>Fit the model |
| `void` | `fit(DataSetIterator iter)`<br>Train the model based on the DataSetIterator |
| `void` | `fit(INDArray examples, INDArray labels)`<br>Fit the model |
| `void` | `fit(INDArray examples, int[] labels)`<br>Fit the model |
| `boolean` | `isPretrainLayer()`<br>Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.) |
| `boolean` | `needsLabels()`<br>Returns true if labels are required for this output layer |
| `int` | `numLabels()`<br>Returns the number of possible labels |
| `List<String>` | `predict(DataSet dataSet)`<br>Takes in a DataSet of examples and, for each row, returns a label |
| `int[]` | `predict(INDArray examples)`<br>Takes in a list of examples and, for each row, returns a label |
| `void` | `setMaskArray(INDArray maskArray)`<br>Set the mask array. |
| `Layer.Type` | `type()`<br>Returns the layer type |
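The `f1Score` methods listed above return the F1 score: the harmonic mean of precision and recall. As a standalone reference computation (the classification counts below are made up purely for illustration):

```java
// F1 = 2 * precision * recall / (precision + recall), from hypothetical counts
int truePositives = 8, falsePositives = 2, falseNegatives = 4;

double precision = truePositives / (double) (truePositives + falsePositives); // 8/10
double recall    = truePositives / (double) (truePositives + falseNegatives); // 8/12

double f1 = 2.0 * precision * recall / (precision + recall); // 8/11, about 0.727
```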
Methods inherited from class BaseLayer: `clear, clearNoiseWeightParams, clone, computeGradientAndScore, fit, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, gradient, hasBias, hasLayerNorm, layerConf, numParams, params, paramTable, paramTable, preOutput, preOutputWithPreNorm, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, update`

Methods inherited from class AbstractLayer: `activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, close, conf, getConfig, getEpochCount, getHelper, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, gradientAndScore, init, input, layerId, numParams, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, updaterDivideByMinibatch`

Methods inherited from class java.lang.Object: `equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait`

Methods inherited from interface IOutputLayer: `getLabels, setLabels`

Methods inherited from interface Layer: `activate, allowInputModification, clearNoiseWeightParams, getEpochCount, getHelper, getIndex, getInputMiniBatchSize, getIterationCount, getListeners, getMaskArray, setCacheMode, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setIterationCount, setListeners, setListeners`

Methods inherited from interface Trainable: `getConfig, getGradientsViewArray, numParams, params, paramTable, updaterDivideByMinibatch`

Methods inherited from interface Model: `addListeners, applyConstraints, batchSize, clear, close, computeGradientAndScore, conf, fit, fit, getGradientsViewArray, getOptimizer, getParam, gradient, gradientAndScore, init, input, numParams, numParams, params, paramTable, paramTable, score, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update`

labels

`protected INDArray labels`
CnnLossLayer

`public CnnLossLayer(NeuralNetConfiguration conf, DataType dataType)`
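The `backpropGradient` method detailed next receives `epsilon = dC/da` and converts it to `dC/dz` via the chain rule `dC/dz = (dC/da) * (da/dz)`. A standalone numeric illustration for a sigmoid activation and squared-error cost, in plain Java independent of DL4J (the values and cost function are illustrative):

```java
// Chain rule the epsilon parameter refers to:
// epsilon = dC/da; for a = sigmoid(z), da/dz = a * (1 - a),
// so dC/dz = epsilon * a * (1 - a).
double z = 0.5;
double a = 1.0 / (1.0 + Math.exp(-z));   // sigmoid activation, about 0.6225
double epsilon = 2.0 * (a - 1.0);        // dC/da for C = (a - y)^2 with label y = 1
double dCdz = epsilon * a * (1.0 - a);   // backpropagated gradient w.r.t. z
```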
backpropGradient

`public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)`

Calculate the gradient relative to the error in the next layer.

Specified by: `backpropGradient` in interface `Layer`
Overrides: `backpropGradient` in class `BaseLayer<CnnLossLayer>`
Parameters:
`epsilon` - `w^(L+1)*delta^(L+1)`. Or equivalently `dC/da`, i.e. `(dC/dz)*(dz/da) = dC/da`, where C is the cost function and `a = sigma(z)` is the activation.
`workspaceMgr` - Workspace manager
Returns: the epsilon (activation gradient) array, defined in the `ArrayType.ACTIVATION_GRAD` workspace via the workspace manager

calcRegularizationScore

`public double calcRegularizationScore(boolean backpropParamsOnly)`
Specified by: `calcRegularizationScore` in interface `Layer`
Overrides: `calcRegularizationScore` in class `BaseLayer<CnnLossLayer>`
Parameters:
`backpropParamsOnly` - If true: calculate the regularization score based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

f1Score

`public double f1Score(DataSet data)`
Specified by: `f1Score` in interface `Classifier`
Parameters:
`data` - the data to score

f1Score

`public double f1Score(INDArray examples, INDArray labels)`
Specified by: `f1Score` in interface `Classifier`
Parameters:
`examples` - the examples to classify (one example in each row)
`labels` - the true labels

numLabels

`public int numLabels()`
Specified by: `numLabels` in interface `Classifier`

fit

`public void fit(DataSetIterator iter)`

Specified by: `fit` in interface `Classifier`
Parameters:
`iter` - the iterator to train on

predict

`public int[] predict(INDArray examples)`

Specified by: `predict` in interface `Classifier`
Parameters:
`examples` - the examples to classify (one example in each row)

predict

`public List<String> predict(DataSet dataSet)`

Specified by: `predict` in interface `Classifier`
Parameters:
`dataSet` - the examples to classify

fit

`public void fit(INDArray examples, INDArray labels)`

Specified by: `fit` in interface `Classifier`
Parameters:
`examples` - the examples to classify (one example in each row)
`labels` - the example labels (a binary outcome matrix)

fit

`public void fit(DataSet data)`

Specified by: `fit` in interface `Classifier`
Parameters:
`data` - the data to train on

fit

`public void fit(INDArray examples, int[] labels)`

Specified by: `fit` in interface `Classifier`
Parameters:
`examples` - the examples to classify (one example in each row)
`labels` - the labels for each example (the number of labels must match the number of rows in the examples)

type

`public Layer.Type type()`
Specified by: `type` in interface `Layer`
Overrides: `type` in class `AbstractLayer<CnnLossLayer>`

activate

`public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)`

Specified by: `activate` in interface `Layer`
Overrides: `activate` in class `BaseLayer<CnnLossLayer>`
Parameters:
`training` - training or test mode
`workspaceMgr` - Workspace manager
Returns: the activations array, defined in the `ArrayType.ACTIVATIONS` workspace via the workspace manager

setMaskArray

`public void setMaskArray(INDArray maskArray)`
Description copied from interface: `Layer`. `Layer.feedForwardMaskArray(INDArray, MaskState, int)` should be used in preference to this.

Specified by: `setMaskArray` in interface `Layer`
Overrides: `setMaskArray` in class `AbstractLayer<CnnLossLayer>`
Parameters:
`maskArray` - Mask array to set

isPretrainLayer

`public boolean isPretrainLayer()`
Specified by: `isPretrainLayer` in interface `Layer`

feedForwardMaskArray

`public Pair<INDArray,MaskState> feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState, int minibatchSize)`
Specified by: `feedForwardMaskArray` in interface `Layer`
Overrides: `feedForwardMaskArray` in class `AbstractLayer<CnnLossLayer>`
Parameters:
`maskArray` - Mask array to set
`currentMaskState` - Current state of the mask; see `MaskState`
`minibatchSize` - Current minibatch size. Needs to be known, as it cannot always be inferred from the activations array due to reshaping (such as a DenseLayer within a recurrent neural network)

needsLabels

`public boolean needsLabels()`
Specified by: `needsLabels` in interface `IOutputLayer`

computeScore

`public double computeScore(double fullNetRegTerm, boolean training, LayerWorkspaceMgr workspaceMgr)`

Specified by: `computeScore` in interface `IOutputLayer`
Parameters:
`fullNetRegTerm` - Regularization score (l1/l2/weight decay) for the entire network
`training` - whether the score should be calculated at train or test time (this affects things like application of dropout, etc.)

computeScoreForExamples

`public INDArray computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr)`
Specified by: `computeScoreForExamples` in interface `IOutputLayer`
Parameters:
`fullNetRegTerm` - Regularization score term for the entire network (or 0.0 to not include regularization)

Copyright © 2020. All rights reserved.