Class CenterLossOutputLayer

java.lang.Object
  org.deeplearning4j.nn.layers.AbstractLayer<LayerConfT>
    org.deeplearning4j.nn.layers.BaseLayer<LayerConfT>
      org.deeplearning4j.nn.layers.BaseOutputLayer<CenterLossOutputLayer>
        org.deeplearning4j.nn.layers.training.CenterLossOutputLayer

All Implemented Interfaces:
Serializable, Cloneable, Classifier, Layer, IOutputLayer, Model, Trainable

public class CenterLossOutputLayer extends BaseOutputLayer<CenterLossOutputLayer>

See Also:
Serialized Form
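For orientation: center loss augments a standard classification output layer with a penalty that pulls each example's feature vector toward a running per-class center, encouraging intra-class compactness. The sketch below illustrates the two pieces involved, the penalty (lambda/2) * sum_i ||x_i - c_{y_i}||^2 and the moving-average center update, using plain Java arrays. The class and method names here are hypothetical illustrations, not the DL4J implementation.

```java
import java.util.Arrays;

public class CenterLossSketch {
    // Squared Euclidean distance between an embedding and its class center
    static double squaredDistance(double[] x, double[] center) {
        double sum = 0.0;
        for (int d = 0; d < x.length; d++) {
            double diff = x[d] - center[d];
            sum += diff * diff;
        }
        return sum;
    }

    // Center-loss penalty for a batch: (lambda / 2) * sum_i ||x_i - c_{y_i}||^2
    static double centerLossPenalty(double[][] embeddings, int[] labels,
                                    double[][] centers, double lambda) {
        double total = 0.0;
        for (int i = 0; i < embeddings.length; i++) {
            total += squaredDistance(embeddings[i], centers[labels[i]]);
        }
        return 0.5 * lambda * total;
    }

    // Moving-average center update: c_j += alpha * mean(x_i - c_j)
    // over the batch examples belonging to class j
    static void updateCenters(double[][] embeddings, int[] labels,
                              double[][] centers, double alpha) {
        int numClasses = centers.length;
        int dim = centers[0].length;
        double[][] delta = new double[numClasses][dim];
        int[] counts = new int[numClasses];
        for (int i = 0; i < embeddings.length; i++) {
            int j = labels[i];
            counts[j]++;
            for (int d = 0; d < dim; d++) {
                delta[j][d] += embeddings[i][d] - centers[j][d];
            }
        }
        for (int j = 0; j < numClasses; j++) {
            if (counts[j] == 0) continue;   // no examples of this class in the batch
            for (int d = 0; d < dim; d++) {
                centers[j][d] += alpha * delta[j][d] / counts[j];
            }
        }
    }

    public static void main(String[] args) {
        double[][] x = {{1.0, 0.0}, {0.0, 1.0}};
        int[] y = {0, 1};
        double[][] centers = {{0.0, 0.0}, {0.0, 0.0}};
        // Penalty: 0.5 * 0.5 * (1 + 1) = 0.5
        System.out.println(centerLossPenalty(x, y, centers, 0.5));
        updateCenters(x, y, centers, 0.1);
        System.out.println(Arrays.toString(centers[0])); // center 0 moves toward x_0
    }
}
```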
Nested Class Summary

Nested classes/interfaces inherited from interface org.deeplearning4j.nn.api.Layer:
Layer.TrainingMode, Layer.Type
-
Field Summary
-
Fields inherited from class org.deeplearning4j.nn.layers.BaseOutputLayer
inputMaskArray, inputMaskArrayState, labels
-
Fields inherited from class org.deeplearning4j.nn.layers.BaseLayer
gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, weightNoiseParams
-
Fields inherited from class org.deeplearning4j.nn.layers.AbstractLayer
cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners
-
Constructor Summary

CenterLossOutputLayer(NeuralNetConfiguration conf, DataType dataType)
-
Method Summary

Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
  Calculate the gradient relative to the error in the next layer
void computeGradientAndScore(LayerWorkspaceMgr workspaceMgr)
  Update the score
double computeScore(double fullNetRegTerm, boolean training, LayerWorkspaceMgr workspaceMgr)
  Compute score after labels and input have been set.
INDArray computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr)
  Compute the score for each example individually, after labels and input have been set.
protected INDArray getLabels2d(LayerWorkspaceMgr workspaceMgr, ArrayType arrayType)
Gradient gradient()
  Gets the gradient from one training iteration
Pair<Gradient,Double> gradientAndScore()
  Get the gradient and score
protected void setScoreWithZ(INDArray z)
-
Methods inherited from class org.deeplearning4j.nn.layers.BaseOutputLayer
activate, applyMask, clear, f1Score, f1Score, fit, fit, fit, fit, fit, getLabels, hasBias, isPretrainLayer, needsLabels, numLabels, predict, predict, preOutput2d, setLabels
-
Methods inherited from class org.deeplearning4j.nn.layers.BaseLayer
activate, calcRegularizationScore, clearNoiseWeightParams, clone, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, hasLayerNorm, layerConf, numParams, params, paramTable, paramTable, preOutput, preOutputWithPreNorm, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, toString, update, update
-
Methods inherited from class org.deeplearning4j.nn.layers.AbstractLayer
addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, assertInputSet, backpropDropOutIfPresent, batchSize, close, conf, feedForwardMaskArray, getConfig, getEpochCount, getHelper, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, init, input, layerId, numParams, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, type, updaterDivideByMinibatch
-
Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait
-
Methods inherited from interface org.deeplearning4j.nn.api.Layer
activate, allowInputModification, calcRegularizationScore, clearNoiseWeightParams, feedForwardMaskArray, getEpochCount, getHelper, getIndex, getInputMiniBatchSize, getIterationCount, getListeners, getMaskArray, setCacheMode, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setIterationCount, setListeners, setListeners, setMaskArray, type
-
Methods inherited from interface org.deeplearning4j.nn.api.Model
addListeners, applyConstraints, batchSize, close, conf, fit, getGradientsViewArray, getOptimizer, getParam, init, input, numParams, numParams, params, paramTable, paramTable, score, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update
-
Methods inherited from interface org.deeplearning4j.nn.api.Trainable
getConfig, getGradientsViewArray, numParams, params, paramTable, updaterDivideByMinibatch
-
-
Constructor Detail
-
CenterLossOutputLayer
public CenterLossOutputLayer(NeuralNetConfiguration conf, DataType dataType)
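This internal layer is normally instantiated by the framework from its configuration counterpart rather than constructed directly. A typical way to get a center-loss output layer into a network is via the builder of org.deeplearning4j.nn.conf.layers.CenterLossOutputLayer; the fragment below is a configuration sketch under that assumption, and the exact builder options (such as alpha and lambda) and their default values should be checked against the DL4J version in use.

```java
// Configuration sketch (not a standalone program): a network whose output layer
// is a center-loss output layer. alpha is the center update rate; lambda weights
// the center-loss penalty relative to the primary loss function.
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .updater(new Adam(1e-3))
        .list()
        .layer(new DenseLayer.Builder()
                .nIn(784).nOut(256)
                .activation(Activation.RELU)
                .build())
        .layer(new org.deeplearning4j.nn.conf.layers.CenterLossOutputLayer.Builder()
                .lossFunction(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                .nIn(256).nOut(10)
                .activation(Activation.SOFTMAX)
                .alpha(0.1)     // center update rate (illustrative value)
                .lambda(1e-4)   // weight of the center-loss term (illustrative value)
                .build())
        .build();

MultiLayerNetwork net = new MultiLayerNetwork(conf);
net.init();
```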
-
Method Detail
-
computeScore
public double computeScore(double fullNetRegTerm, boolean training, LayerWorkspaceMgr workspaceMgr)
Compute score after labels and input have been set.

Specified by:
computeScore in interface IOutputLayer
Overrides:
computeScore in class BaseOutputLayer<CenterLossOutputLayer>
Parameters:
fullNetRegTerm - Regularization score term for the entire network
training - whether the score should be calculated at train or test time (this affects things like the application of dropout, etc.)
Returns:
score (loss function)
-
computeScoreForExamples
public INDArray computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr)
Compute the score for each example individually, after labels and input have been set.

Specified by:
computeScoreForExamples in interface IOutputLayer
Overrides:
computeScoreForExamples in class BaseOutputLayer<CenterLossOutputLayer>
Parameters:
fullNetRegTerm - Regularization term for the entire network (or 0.0 to not include regularization)
Returns:
A column INDArray of shape [numExamples, 1], where entry i is the score of the ith example
-
computeGradientAndScore
public void computeGradientAndScore(LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Model
Update the score

Specified by:
computeGradientAndScore in interface Model
Overrides:
computeGradientAndScore in class BaseOutputLayer<CenterLossOutputLayer>
-
setScoreWithZ
protected void setScoreWithZ(INDArray z)
Overrides:
setScoreWithZ in class BaseOutputLayer<CenterLossOutputLayer>
-
gradientAndScore
public Pair<Gradient,Double> gradientAndScore()
Description copied from interface: Model
Get the gradient and score

Specified by:
gradientAndScore in interface Model
Overrides:
gradientAndScore in class BaseOutputLayer<CenterLossOutputLayer>
Returns:
the gradient and score
-
backpropGradient
public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer

Specified by:
backpropGradient in interface Layer
Overrides:
backpropGradient in class BaseOutputLayer<CenterLossOutputLayer>
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or equivalently dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns:
Pair<Gradient,INDArray>, where Gradient is the gradient for this layer and INDArray is the epsilon (activation gradient) needed by the next layer, but before the element-wise multiplication by sigmaPrime(z). So for a standard feed-forward layer, if this layer is L, then return.getSecond() == dL/dIn = (w^(L)*(delta^(L))^T)^T. Note that the returned array should be placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager.
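For intuition about how the center-loss term enters this backward pass: the penalty (lambda/2) * ||x - c_y||^2 contributes lambda * (x - c_y) to the activation gradient, added on top of the usual output-layer epsilon. A minimal sketch with plain Java arrays follows; the class and method names are hypothetical illustrations, not the DL4J implementation.

```java
import java.util.Arrays;

public class CenterLossGradientSketch {
    // Adds the center-loss contribution lambda * (x - c_y) to an existing
    // activation gradient (e.g. the softmax/cross-entropy epsilon) for one example.
    static double[] addCenterLossGrad(double[] baseGrad, double[] x,
                                      double[] center, double lambda) {
        double[] out = new double[baseGrad.length];
        for (int d = 0; d < baseGrad.length; d++) {
            out[d] = baseGrad[d] + lambda * (x[d] - center[d]);
        }
        return out;
    }

    public static void main(String[] args) {
        double[] baseGrad = {0.2, -0.1};   // gradient from the primary loss
        double[] x = {1.0, 0.0};           // example's feature vector
        double[] center = {0.5, 0.5};      // its class center
        double[] g = addCenterLossGrad(baseGrad, x, center, 0.1);
        System.out.println(Arrays.toString(g)); // base gradient plus 0.1 * (x - center)
    }
}
```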
-
gradient
public Gradient gradient()
Gets the gradient from one training iteration

Specified by:
gradient in interface Model
Overrides:
gradient in class BaseOutputLayer<CenterLossOutputLayer>
Returns:
the gradient (bias and weight matrix)
-
getLabels2d
protected INDArray getLabels2d(LayerWorkspaceMgr workspaceMgr, ArrayType arrayType)
Specified by:
getLabels2d in class BaseOutputLayer<CenterLossOutputLayer>