Package ai.djl.training.optimizer
Class Nag
java.lang.Object
ai.djl.training.optimizer.Optimizer
ai.djl.training.optimizer.Nag
Nag is a Nesterov accelerated gradient optimizer.
This optimizer updates each weight by:
\( state = momentum * state + grad + wd * weight \)

\( weight = weight - lr * (grad + momentum * state) \)
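The two update equations above can be sketched as plain Java over a weight vector. This is an illustrative re-implementation of the formulas only, not DJL's internal code; the hyperparameter values (lr, momentum, wd) are made up for the example.

```java
public class NagUpdateSketch {

    /**
     * Applies one Nag update step in place:
     * state  = momentum * state + grad + wd * weight
     * weight = weight - lr * (grad + momentum * state)
     */
    static void nagStep(float[] weight, float[] grad, float[] state,
                        float lr, float momentum, float wd) {
        for (int i = 0; i < weight.length; i++) {
            state[i] = momentum * state[i] + grad[i] + wd * weight[i];
            weight[i] -= lr * (grad[i] + momentum * state[i]);
        }
    }

    public static void main(String[] args) {
        float[] weight = {1.0f, -2.0f};
        float[] grad = {0.5f, -0.5f};
        float[] state = new float[2]; // momentum state starts at zero

        // Illustrative hyperparameters, not DJL defaults.
        nagStep(weight, grad, state, 0.1f, 0.9f, 0.0f);

        System.out.println(java.util.Arrays.toString(weight));
    }
}
```

With zero initial state and no weight decay, the first step reduces to `weight -= lr * (1 + momentum) * grad`, which is why Nag takes a slightly larger first step than plain SGD with momentum.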
Nested Class Summary
Nested classes/interfaces inherited from class ai.djl.training.optimizer.Optimizer
Optimizer.OptimizerBuilder<T extends Optimizer.OptimizerBuilder>
Field Summary
Fields inherited from class ai.djl.training.optimizer.Optimizer
clipGrad, rescaleGrad
Constructor Summary
protected Nag(Nag.Builder builder)
Creates a new instance of Nag optimizer.
Method Summary
Methods inherited from class ai.djl.training.optimizer.Optimizer
adadelta, adagrad, adam, adamW, getWeightDecay, nag, rmsprop, sgd, updateCount, withDefaultState
Constructor Details
Nag
protected Nag(Nag.Builder builder)
Creates a new instance of Nag optimizer.
Parameters:
builder - the builder to create a new instance of Nag optimizer
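Since the constructor is protected, instances are normally obtained through the static Optimizer.nag() factory listed in the method summary, which returns a Nag.Builder. A minimal usage sketch, assuming DJL's Tracker API for the learning rate; the builder method names and values below are assumptions to verify against your DJL version:

```java
import ai.djl.training.optimizer.Nag;
import ai.djl.training.optimizer.Optimizer;
import ai.djl.training.tracker.Tracker;

public class NagBuilderExample {
    public static void main(String[] args) {
        // Build a Nag optimizer via the static factory. The learning rate,
        // momentum, and weight decay values are illustrative, not defaults.
        Nag nag = Optimizer.nag()
                .setLearningRateTracker(Tracker.fixed(0.1f))
                .setMomentum(0.9f)
                .optWeightDecays(0.0001f) // the wd term in the update formula
                .build();
    }
}
```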
Method Details