public class ElasticNetWeightDecay extends Loss
ElasticNetWeightDecay calculates the combined L1 + L2 penalty of a set of parameters. Used for
regularization.

The loss is defined as \(L = \lambda_1 \sum_i \vert W_i\vert + \lambda_2 \sum_i {W_i}^2\).
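As a concrete illustration of the formula above, the penalty can be computed directly in plain Java. This is a standalone sketch of the math, not DJL API code; the weight and lambda values are made up for the example:

```java
public class ElasticNetPenaltyDemo {

    // Computes L = lambda1 * sum(|w_i|) + lambda2 * sum(w_i^2)
    public static double elasticNetPenalty(double[] weights, double lambda1, double lambda2) {
        double l1 = 0.0;
        double l2 = 0.0;
        for (double w : weights) {
            l1 += Math.abs(w); // L1 term: sum of absolute values
            l2 += w * w;       // L2 term: sum of squares
        }
        return lambda1 * l1 + lambda2 * l2;
    }

    public static void main(String[] args) {
        double[] weights = {1.0, -2.0, 3.0}; // hypothetical parameter values
        // sum|w| = 6.0, sum w^2 = 14.0, so L = 0.01 * 6.0 + 0.001 * 14.0 = 0.074
        System.out.println(elasticNetPenalty(weights, 0.01, 0.001));
    }
}
```

In DJL the `NDList` of parameters is captured at construction time, so the penalty is evaluated over the live model weights rather than a copied array as in this sketch.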
Fields inherited from class Loss: totalInstances
| Constructor and Description |
|---|
| `ElasticNetWeightDecay(NDList parameters)` Calculates Elastic Net weight decay for regularization. |
| `ElasticNetWeightDecay(java.lang.String name, NDList parameters)` Calculates Elastic Net weight decay for regularization. |
| `ElasticNetWeightDecay(java.lang.String name, NDList parameters, float lambda)` Calculates Elastic Net weight decay for regularization. |
| `ElasticNetWeightDecay(java.lang.String name, NDList parameters, float lambda1, float lambda2)` Calculates Elastic Net weight decay for regularization. |
| Modifier and Type | Method and Description |
|---|---|
| `NDArray` | `evaluate(NDList label, NDList prediction)` Calculates the evaluation between the labels and the predictions. |
Methods inherited from class Loss: addAccumulator, elasticNetWeightedDecay, elasticNetWeightedDecay, elasticNetWeightedDecay, elasticNetWeightedDecay, getAccumulator, hingeLoss, hingeLoss, hingeLoss, l1Loss, l1Loss, l1Loss, l1WeightedDecay, l1WeightedDecay, l1WeightedDecay, l2Loss, l2Loss, l2Loss, l2WeightedDecay, l2WeightedDecay, l2WeightedDecay, maskedSoftmaxCrossEntropyLoss, maskedSoftmaxCrossEntropyLoss, maskedSoftmaxCrossEntropyLoss, resetAccumulator, sigmoidBinaryCrossEntropyLoss, sigmoidBinaryCrossEntropyLoss, sigmoidBinaryCrossEntropyLoss, softmaxCrossEntropyLoss, softmaxCrossEntropyLoss, softmaxCrossEntropyLoss, updateAccumulator
Methods inherited from class Evaluator: checkLabelShapes, checkLabelShapes, getName
public ElasticNetWeightDecay(NDList parameters)

Parameters:
    parameters - holds the model weights that will be penalized

public ElasticNetWeightDecay(java.lang.String name, NDList parameters)

Parameters:
    name - the name of the penalty
    parameters - holds the model weights that will be penalized

public ElasticNetWeightDecay(java.lang.String name, NDList parameters, float lambda)

Parameters:
    name - the name of the penalty
    parameters - holds the model weights that will be penalized
    lambda - the weight to apply to the penalty value, default 1 (applied to both the L1 and L2 terms)

public ElasticNetWeightDecay(java.lang.String name, NDList parameters, float lambda1, float lambda2)

Parameters:
    name - the name of the penalty
    parameters - holds the model weights that will be penalized
    lambda1 - the weight to apply to the L1 penalty value, default 1
    lambda2 - the weight to apply to the L2 penalty value, default 1
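The relationship between the single-lambda and two-lambda constructors above can be sketched in plain Java: the single weight is applied to both the L1 and L2 terms, which is equivalent to calling the two-lambda form with equal weights. This is a standalone illustration of that semantics, not the DJL implementation itself:

```java
public class LambdaOverloadDemo {

    // Two-lambda form: L = lambda1 * sum(|w|) + lambda2 * sum(w^2)
    public static double penalty(double[] w, double lambda1, double lambda2) {
        double l1 = 0.0;
        double l2 = 0.0;
        for (double x : w) {
            l1 += Math.abs(x);
            l2 += x * x;
        }
        return lambda1 * l1 + lambda2 * l2;
    }

    // Single-lambda form: the same weight is used for both terms,
    // mirroring the float-lambda constructor documented above.
    public static double penalty(double[] w, double lambda) {
        return penalty(w, lambda, lambda);
    }

    public static void main(String[] args) {
        double[] w = {0.5, -1.5}; // hypothetical parameter values
        // The single-lambda overload equals the two-lambda overload with lambda1 == lambda2.
        System.out.println(penalty(w, 0.1) == penalty(w, 0.1, 0.1)); // true
    }
}
```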