Package ai.djl.training.optimizer
Class Adagrad
java.lang.Object
ai.djl.training.optimizer.Optimizer
ai.djl.training.optimizer.Adagrad
Adagrad is an AdaGrad Optimizer. This class implements Adagrad, which updates the weights using:
\( grad = clip(grad * rescale_grad, clip_grad) + wd * weight \)
\( history += grad^2 \)
\( weight -= lr * grad / (sqrt(history) + epsilon) \)
where grad represents the gradient, wd the weight decay, lr the learning rate, and epsilon a small constant added for numerical stability.
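As an illustration only (not the DJL implementation), the update rule above can be written in plain Java; the class name and hyperparameter values below are hypothetical:

import java.util.Arrays;

/** Plain-Java sketch of the AdaGrad update rule documented above; not DJL's implementation. */
public final class AdagradUpdateSketch {

    public static void main(String[] args) {
        float lr = 0.01f;          // learning rate (lr)
        float epsilon = 1e-8f;     // small constant for numerical stability
        float wd = 0.0f;           // weight decay (wd)
        float clipGrad = 1.0f;     // gradient clipping threshold (clip_grad)
        float rescaleGrad = 1.0f;  // gradient rescaling factor (rescale_grad)

        float[] weight = {0.5f, -0.3f};
        float[] grad = {0.2f, -0.1f};
        float[] history = new float[weight.length]; // accumulated squared gradients

        for (int i = 0; i < weight.length; i++) {
            // grad = clip(grad * rescale_grad, clip_grad) + wd * weight
            float g = Math.max(-clipGrad, Math.min(clipGrad, grad[i] * rescaleGrad)) + wd * weight[i];
            // history += grad^2
            history[i] += g * g;
            // weight -= lr * grad / (sqrt(history) + epsilon)
            weight[i] -= lr * g / ((float) Math.sqrt(history[i]) + epsilon);
        }

        System.out.println(Arrays.toString(weight));
    }
}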
Nested Class Summary
Nested classes/interfaces inherited from class ai.djl.training.optimizer.Optimizer
Optimizer.OptimizerBuilder<T extends Optimizer.OptimizerBuilder>
-
Field Summary
Fields inherited from class ai.djl.training.optimizer.Optimizer
clipGrad, rescaleGrad
-
Constructor Summary
protected Adagrad(Adagrad.Builder builder)
Creates a new instance of Adagrad optimizer.
-
Method Summary
Methods inherited from class ai.djl.training.optimizer.Optimizer
adadelta, adagrad, adam, adamW, getWeightDecay, nag, rmsprop, sgd, updateCount, withDefaultState
-
Constructor Details
-
Adagrad
protected Adagrad(Adagrad.Builder builder)
Creates a new instance of Adagrad optimizer.
- Parameters:
builder - the builder used to create a new instance of Adagrad optimizer
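Because the constructor is protected, an Adagrad instance is normally obtained through the builder returned by the inherited Optimizer.adagrad() factory method. The following is a minimal usage sketch, assuming the builder follows the usual DJL pattern of exposing a build() method (the builder's optXxx configuration methods are not shown, since their exact names may vary by version):

import ai.djl.training.optimizer.Adagrad;
import ai.djl.training.optimizer.Optimizer;

public class CreateAdagradExample {

    public static void main(String[] args) {
        // Obtain the builder via the static factory inherited from Optimizer
        // and build an Adagrad optimizer with its default settings.
        Adagrad optimizer = Optimizer.adagrad().build();
        System.out.println("Created: " + optimizer);
    }
}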
-
Method Details