Package ai.djl.training.evaluator
Class Coverage
java.lang.Object
ai.djl.training.evaluator.Evaluator
ai.djl.training.evaluator.AbstractAccuracy
ai.djl.training.evaluator.Coverage
Coverage for a regression problem: it measures the percent of predictions greater than the actual target, which indicates whether the predictor is over-forecasting or under-forecasting. For example, coverage is about 0.50 if we predict near the median of the distribution.
def coverage(target, forecast):
    return np.mean(target < forecast)
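Below is a minimal usage sketch of this evaluator through the accumulator API inherited from Evaluator. The target/forecast values and the accumulator key "train" are illustrative choices, not part of the API; by the definition above, half of these forecasts exceed their targets, so the accumulated coverage should come out to 0.5.

import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDList;
import ai.djl.ndarray.NDManager;
import ai.djl.training.evaluator.Coverage;

public class CoverageExample {
    public static void main(String[] args) {
        // Requires a DJL engine (e.g. PyTorch or MXNet) on the classpath.
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray target = manager.create(new float[] {1f, 2f, 3f, 4f});
            NDArray forecast = manager.create(new float[] {1.5f, 1.5f, 3.5f, 3.5f});

            Coverage coverage = new Coverage();
            coverage.addAccumulator("train"); // "train" is an arbitrary accumulator key
            coverage.updateAccumulator("train", new NDList(target), new NDList(forecast));

            // 2 of the 4 forecasts are above their targets, so this should print 0.5
            System.out.println(coverage.getAccumulator("train"));
        }
    }
}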
Field Summary
Fields inherited from class ai.djl.training.evaluator.AbstractAccuracy
axis, correctInstances
Fields inherited from class ai.djl.training.evaluator.Evaluator
totalInstances
Constructor Summary
Constructors:
Coverage() - Creates an evaluator that measures the percent of predictions greater than the actual target.
Coverage(String name, int axis) - Creates an evaluator that measures the percent of predictions greater than the actual target.
Method Summary
accuracyHelper(NDList labels, NDList predictions) - A helper for classes extending AbstractAccuracy.
Methods inherited from class ai.djl.training.evaluator.AbstractAccuracy
addAccumulator, evaluate, getAccumulator, resetAccumulator, updateAccumulator, updateAccumulators
Methods inherited from class ai.djl.training.evaluator.Evaluator
checkLabelShapes, checkLabelShapes, getName
Constructor Details
Coverage
public Coverage()
Creates an evaluator that measures the percent of predictions greater than the actual target.
Coverage
public Coverage(String name, int axis)
Creates an evaluator that measures the percent of predictions greater than the actual target.
Parameters:
name - the name of the evaluator, default is "Coverage"
axis - the axis along which to count the correct prediction, default is 1
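As an illustration of the two-argument constructor (the name "p50_coverage" is an arbitrary example, not part of the API):

// Illustrative only: a coverage evaluator with a custom report name;
// axis 1 matches the documented default.
Coverage coverage = new Coverage("p50_coverage", 1);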
Method Details
accuracyHelper
accuracyHelper(NDList labels, NDList predictions)
A helper for classes extending AbstractAccuracy.
Specified by:
accuracyHelper in class AbstractAccuracy
Parameters:
labels - the labels to get accuracy for
predictions - the predictions to get accuracy for
Returns:
a pair (number of total values, int NDArray of correct values)
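For orientation, here is a sketch of a coverage-style accuracyHelper in a subclass of AbstractAccuracy. The Pair<Long, NDArray> return type is assumed from the Returns note above, and the class name MyCoverage is illustrative; this shows the contract of the method, not necessarily the library's exact implementation of Coverage.

import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDList;
import ai.djl.training.evaluator.AbstractAccuracy;
import ai.djl.util.Pair;

// Illustrative subclass: counts a prediction as "correct" when it exceeds the target,
// matching the coverage definition at the top of this page.
public class MyCoverage extends AbstractAccuracy {

    public MyCoverage() {
        super("MyCoverage", 1);
    }

    @Override
    protected Pair<Long, NDArray> accuracyHelper(NDList labels, NDList predictions) {
        NDArray target = labels.head();
        NDArray forecast = predictions.head();
        // size() supplies the total count; lt() flags the element-wise over-forecast cases
        return new Pair<>(target.size(), target.lt(forecast));
    }
}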