Class | Description |
---|---|
AbsoluteDifferenceLoss | Absolute difference loss |
BaseLoss | |
CosineDistanceLoss | Cosine distance loss |
HingeLoss | Hinge loss |
HuberLoss | Huber loss |
L2Loss | L2 loss op wrapper |
LogLoss | Binary log loss, or cross entropy loss: -1/numExamples * sum_i (labels[i] * log(predictions[i] + epsilon) + (1-labels[i]) * log(1-predictions[i] + epsilon)) |
LogPoissonLoss | Log Poisson loss. Note: this expects the input/predictions to be log(x), not x. |
MeanPairwiseSquaredErrorLoss | Mean pairwise squared error loss |
MeanSquaredErrorLoss | Mean squared error loss |
SigmoidCrossEntropyLoss | Sigmoid cross entropy loss with logits |
SoftmaxCrossEntropyLoss | Softmax cross entropy loss |
SoftmaxCrossEntropyWithLogitsLoss | Softmax cross entropy loss with logits |
SparseSoftmaxCrossEntropyLossWithLogits | Sparse softmax cross entropy loss with logits |
WeightedCrossEntropyLoss | Weighted cross entropy loss with logits |
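The binary log loss formula given for LogLoss can be sketched in plain Java as follows. This is a minimal illustration of the formula only, not the library's implementation; the class and method names here are hypothetical, and the epsilon term guards against log(0) exactly as in the formula above.

```java
// Hypothetical standalone sketch of the binary log loss (cross entropy) formula:
// -1/numExamples * sum_i (labels[i] * log(predictions[i] + epsilon)
//                         + (1 - labels[i]) * log(1 - predictions[i] + epsilon))
public class LogLossSketch {

    // labels are 0 or 1; predictions are probabilities in [0, 1];
    // epsilon is a small constant that prevents taking log of zero
    static double logLoss(double[] labels, double[] predictions, double epsilon) {
        double sum = 0.0;
        for (int i = 0; i < labels.length; i++) {
            sum += labels[i] * Math.log(predictions[i] + epsilon)
                 + (1 - labels[i]) * Math.log(1 - predictions[i] + epsilon);
        }
        return -sum / labels.length;
    }

    public static void main(String[] args) {
        double[] labels      = {1.0, 0.0};
        double[] predictions = {0.9, 0.1};
        // Confident predictions on both examples give a small loss (about 0.105)
        System.out.println(logLoss(labels, predictions, 1e-7));
    }
}
```

A perfect prediction (probability 1 for label 1, probability 0 for label 0) drives the loss toward zero, while confidently wrong predictions are penalized heavily, which is why the epsilon guard matters near the boundaries.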
Copyright © 2020. All rights reserved.