public class Yolo2OutputLayer extends AbstractLayer<Yolo2OutputLayer> implements Serializable, IOutputLayer
Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

| Modifier and Type | Field and Description |
|---|---|
| protected INDArray | labels |
Fields inherited from class AbstractLayer: cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners

| Constructor and Description |
|---|
| Yolo2OutputLayer(NeuralNetConfiguration conf, DataType dataType) |
| Modifier and Type | Method and Description |
|---|---|
| INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr): Perform forward pass and return the activations array with the last set input |
| Pair&lt;Gradient,INDArray&gt; | backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr): Calculate the gradient relative to the error in the next layer |
| void | clearNoiseWeightParams() |
| Layer | clone() |
| void | computeGradientAndScore(LayerWorkspaceMgr workspaceMgr): Update the score |
| double | computeScore(double fullNetRegTerm, boolean training, LayerWorkspaceMgr workspaceMgr): Compute score after labels and input have been set |
| INDArray | computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr): Compute the score for each example individually, after labels and input have been set |
| double | f1Score(DataSet data): Sets the input and labels and returns a score for the prediction with respect to the true labels |
| double | f1Score(INDArray examples, INDArray labels): Returns the F1 score for the given examples |
| void | fit(DataSet data): Fit the model |
| void | fit(DataSetIterator iter): Train the model based on the DataSetIterator |
| void | fit(INDArray examples, INDArray labels): Fit the model |
| void | fit(INDArray examples, int[] labels): Fit the model |
| INDArray | getConfidenceMatrix(INDArray networkOutput, int example, int bbNumber): Get the confidence matrix (confidence for all x/y positions) for the specified bounding box, from the network output activations array |
| List&lt;DetectedObject&gt; | getPredictedObjects(INDArray networkOutput, double threshold) |
| INDArray | getProbabilityMatrix(INDArray networkOutput, int example, int classNumber): Get the probability matrix (probability of the specified class, assuming an object is present, for all x/y positions), from the network output activations array |
| Pair&lt;Gradient,Double&gt; | gradientAndScore(): Get the gradient and score |
| boolean | isPretrainLayer(): Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.) |
| boolean | needsLabels(): Returns true if labels are required for this output layer |
| int | numLabels(): Returns the number of possible labels |
| List&lt;String&gt; | predict(DataSet dataSet): Takes in a DataSet of examples; for each row, returns a label |
| int[] | predict(INDArray examples): Takes in a list of examples; for each row, returns a label |
| double | score(): The score for the model |
Methods inherited from class AbstractLayer: activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, calcRegularizationScore, clear, close, conf, feedForwardMaskArray, fit, fit, getConfig, getEpochCount, getGradientsViewArray, getHelper, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, getParam, gradient, init, input, layerConf, layerId, numParams, numParams, params, paramTable, paramTable, setBackpropGradientsViewArray, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, type, update, update, updaterDivideByMinibatch

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface IOutputLayer: getLabels, setLabels

Methods inherited from interface Layer: activate, allowInputModification, calcRegularizationScore, feedForwardMaskArray, getEpochCount, getHelper, getIndex, getInputMiniBatchSize, getIterationCount, getListeners, getMaskArray, setCacheMode, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setIterationCount, setListeners, setListeners, setMaskArray, type

Methods inherited from interface Trainable: getConfig, getGradientsViewArray, numParams, params, paramTable, updaterDivideByMinibatch

Methods inherited from interface Model: addListeners, applyConstraints, batchSize, clear, close, conf, fit, fit, getGradientsViewArray, getOptimizer, getParam, gradient, init, input, numParams, numParams, params, paramTable, paramTable, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update

protected INDArray labels
public Yolo2OutputLayer(NeuralNetConfiguration conf, DataType dataType)
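This constructor is normally invoked by DL4J itself when a network is initialized from a configuration containing the corresponding org.deeplearning4j.nn.conf.layers.objdetect.Yolo2OutputLayer configuration class; user code typically retrieves the built layer from the network instead. A minimal sketch, assuming model is a ComputationGraph whose final layer is a YOLO2 output layer:

```java
import org.deeplearning4j.nn.graph.ComputationGraph;
import org.deeplearning4j.nn.layers.objdetect.Yolo2OutputLayer;

public class GetYoloLayer {
    // Sketch: cast the network's output layer to the YOLO2 implementation class
    // so the detection-specific methods on this page become available.
    static Yolo2OutputLayer yoloLayer(ComputationGraph model) {
        return (Yolo2OutputLayer) model.getOutputLayer(0); // 0 = first (only) output layer
    }
}
```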
public Pair&lt;Gradient,INDArray&gt; backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)

Calculate the gradient relative to the error in the next layer.
Specified by: backpropGradient in interface Layer
Parameters:
epsilon - w^(L+1)*delta^(L+1), or equivalently dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation
workspaceMgr - Workspace manager
Returns: the gradient, with the epsilon array placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager

public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)

Perform forward pass and return the activations array with the last set input.
Specified by: activate in interface Layer
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns: the activations array, placed in the ArrayType.ACTIVATIONS workspace via the workspace manager
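A sketch of the layer-level forward pass using activate, assuming layer was obtained as in the constructor example and input already holds the activations feeding this layer; LayerWorkspaceMgr.noWorkspaces() keeps the example free of external workspace setup:

```java
import org.deeplearning4j.nn.layers.objdetect.Yolo2OutputLayer;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;

public class ForwardPassSketch {
    static INDArray forward(Yolo2OutputLayer layer, INDArray input) {
        LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces(); // no workspace scoping
        layer.setInput(input, mgr);          // becomes the "last set input"
        return layer.activate(false, mgr);   // false = test mode (no dropout, etc.)
    }
}
```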
public boolean needsLabels()

Returns true if labels are required for this output layer.
Specified by: needsLabels in interface IOutputLayer

public double computeScore(double fullNetRegTerm, boolean training, LayerWorkspaceMgr workspaceMgr)

Compute score after labels and input have been set.
Specified by: computeScore in interface IOutputLayer
Parameters:
fullNetRegTerm - Regularization score (l1/l2/weight decay) for the entire network
training - whether the score should be calculated at train or test time (this affects things like the application of dropout)
public double score()

The score for the model.
Specified by: score in interface Model
Overrides: score in class AbstractLayer&lt;Yolo2OutputLayer&gt;

public void computeGradientAndScore(LayerWorkspaceMgr workspaceMgr)

Update the score.
Specified by: computeGradientAndScore in interface Model
Overrides: computeGradientAndScore in class AbstractLayer&lt;Yolo2OutputLayer&gt;

public Pair&lt;Gradient,Double&gt; gradientAndScore()

Get the gradient and score.
Specified by: gradientAndScore in interface Model
Overrides: gradientAndScore in class AbstractLayer&lt;Yolo2OutputLayer&gt;

public INDArray computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr)

Compute the score for each example individually, after labels and input have been set.
Specified by: computeScoreForExamples in interface IOutputLayer
Parameters:
fullNetRegTerm - Regularization score (l1/l2/weight decay) for the entire network
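A sketch of layer-level scoring with computeScore and computeScoreForExamples, assuming features holds the activations feeding this layer and labels is in the layer's expected YOLO label format (commonly [minibatch, 4+C, H, W]; an assumption, not stated on this page). The 0.0 regularization term and test mode are illustrative choices:

```java
import org.deeplearning4j.nn.layers.objdetect.Yolo2OutputLayer;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;

public class ScoreSketch {
    static void printScores(Yolo2OutputLayer layer, INDArray features, INDArray labels) {
        LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();
        layer.setInput(features, mgr); // activations feeding this output layer
        layer.setLabels(labels);       // setLabels is inherited from IOutputLayer

        // Scalar loss over the whole minibatch (0.0 = no network regularization
        // term, false = test-time behaviour, e.g. no dropout)
        double loss = layer.computeScore(0.0, false, mgr);

        // One loss value per example in the minibatch
        INDArray perExample = layer.computeScoreForExamples(0.0, mgr);

        System.out.println("minibatch loss: " + loss);
        System.out.println("per-example losses: " + perExample);
    }
}
```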
public double f1Score(DataSet data)

Sets the input and labels and returns a score for the prediction with respect to the true labels.
Specified by: f1Score in interface Classifier
Parameters:
data - the data to score

public double f1Score(INDArray examples, INDArray labels)

Returns the F1 score for the given examples.
Specified by: f1Score in interface Classifier
Parameters:
examples - the examples to classify (one example in each row)
labels - the true labels

public int numLabels()

Returns the number of possible labels.
Specified by: numLabels in interface Classifier

public void fit(DataSetIterator iter)

Train the model based on the DataSetIterator.
Specified by: fit in interface Classifier
Parameters:
iter - the iterator to train on

public int[] predict(INDArray examples)

Takes in a list of examples; for each row, returns a label.
Specified by: predict in interface Classifier
Parameters:
examples - the examples to classify (one example in each row)

public List&lt;String&gt; predict(DataSet dataSet)

Takes in a DataSet of examples; for each row, returns a label.
Specified by: predict in interface Classifier
Parameters:
dataSet - the examples to classify

public void fit(INDArray examples, INDArray labels)

Fit the model.
Specified by: fit in interface Classifier
Parameters:
examples - the examples to classify (one example in each row)
labels - the example labels (a binary outcome matrix)

public void fit(DataSet data)

Fit the model.
Specified by: fit in interface Classifier
Parameters:
data - the data to train on

public void fit(INDArray examples, int[] labels)

Fit the model.
Specified by: fit in interface Classifier
Parameters:
examples - the examples to classify (one example in each row)
labels - the labels for each example (the number of labels must match the number of rows in the examples)
public boolean isPretrainLayer()

Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.).
Specified by: isPretrainLayer in interface Layer

public void clearNoiseWeightParams()

Specified by: clearNoiseWeightParams in interface Layer

public List&lt;DetectedObject&gt; getPredictedObjects(INDArray networkOutput, double threshold)
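A common post-processing pattern for getPredictedObjects (a sketch, assuming model is a trained detector with this layer as its output; the 0.5 threshold and the pixel-scaling comment are illustrative assumptions):

```java
import java.util.List;
import org.deeplearning4j.nn.graph.ComputationGraph;
import org.deeplearning4j.nn.layers.objdetect.DetectedObject;
import org.deeplearning4j.nn.layers.objdetect.Yolo2OutputLayer;
import org.nd4j.linalg.api.ndarray.INDArray;

public class DetectSketch {
    static List<DetectedObject> detect(ComputationGraph model, INDArray image) {
        INDArray networkOutput = model.outputSingle(image); // raw YOLO activations
        Yolo2OutputLayer yout = (Yolo2OutputLayer) model.getOutputLayer(0);
        // Keep only detections whose confidence exceeds the threshold
        List<DetectedObject> objects = yout.getPredictedObjects(networkOutput, 0.5);
        for (DetectedObject obj : objects) {
            // Positions are in grid-cell units; scale by imageWidth/gridWidth
            // and imageHeight/gridHeight to convert to pixels if needed.
            System.out.printf("class=%d conf=%.2f center=(%.2f, %.2f)%n",
                    obj.getPredictedClass(), obj.getConfidence(),
                    obj.getCenterX(), obj.getCenterY());
        }
        return objects;
    }
}
```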
public INDArray getConfidenceMatrix(INDArray networkOutput, int example, int bbNumber)

Get the confidence matrix (confidence for all x/y positions) for the specified bounding box, from the network output activations array.
Parameters:
networkOutput - Network output activations
example - Example number, in minibatch
bbNumber - Bounding box number

public INDArray getProbabilityMatrix(INDArray networkOutput, int example, int classNumber)

Get the probability matrix (probability of the specified class, assuming an object is present, for all x/y positions), from the network output activations array.
Parameters:
networkOutput - Network output activations
example - Example number, in minibatch
classNumber - Class number
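These two accessors are useful for inspecting where the network believes objects are. A sketch, assuming networkOutput was produced as in the detection example above; example 0, bounding box 0, and class 0 are arbitrary illustrations:

```java
import org.deeplearning4j.nn.layers.objdetect.Yolo2OutputLayer;
import org.nd4j.linalg.api.ndarray.INDArray;

public class HeatmapSketch {
    static void inspect(Yolo2OutputLayer yout, INDArray networkOutput) {
        // Confidence at every grid cell for bounding-box prior 0 of example 0
        INDArray confidence = yout.getConfidenceMatrix(networkOutput, 0, 0);
        // Per-cell probability of class 0, given that an object is present
        INDArray classProb = yout.getProbabilityMatrix(networkOutput, 0, 0);
        System.out.println("confidence grid:\n" + confidence);
        System.out.println("class-0 probability grid:\n" + classProb);
    }
}
```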