public class Nag extends Optimizer

`Nag` is a Nesterov accelerated gradient (NAG) optimizer.

This optimizer updates each weight by:

\( state = momentum * state + grad + wd * weight \)

\( weight = weight - lr * (grad + momentum * state) \)
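The two formulas above can be checked with a minimal plain-Java sketch. Plain `double` values stand in for `NDArray`, and the class and method names here are illustrative only, not part of the DJL API:

```java
/**
 * Minimal plain-Java check of the NAG update formulas above.
 * Scalar doubles stand in for NDArray; variable names mirror
 * the formulas and are not part of the DJL API.
 */
public class NagUpdateSketch {

    /** Returns {newState, newWeight} after one NAG step. */
    static double[] step(double weight, double grad, double state,
                         double lr, double momentum, double wd) {
        // state = momentum * state + grad + wd * weight
        double newState = momentum * state + grad + wd * weight;
        // weight = weight - lr * (grad + momentum * state)
        double newWeight = weight - lr * (grad + momentum * newState);
        return new double[] {newState, newWeight};
    }

    public static void main(String[] args) {
        // one step from weight = 1.0 with grad = 0.5, lr = 0.1, momentum = 0.9, wd = 0
        double[] r = step(1.0, 0.5, 0.0, 0.1, 0.9, 0.0);
        System.out.println("state = " + r[0] + ", weight = " + r[1]);
    }
}
```

Note that the gradient enters the weight update twice: once directly and once through the momentum-scaled state, which is what distinguishes NAG from plain SGD with momentum.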
Modifier and Type | Class and Description
---|---
static class | `Nag.Builder` The Builder to construct a `Nag` object.
Nested classes/interfaces inherited from class `Optimizer`: `Optimizer.OptimizerBuilder<T extends Optimizer.OptimizerBuilder>`
Fields inherited from class `Optimizer`: `clipGrad`, `rescaleGrad`
Modifier | Constructor and Description
---|---
protected | `Nag(Nag.Builder builder)` Creates a new instance of `Nag` optimizer.
Modifier and Type | Method and Description
---|---
void | `update(java.lang.String parameterId, NDArray weight, NDArray grad)` Updates the parameters according to the gradients.
Methods inherited from class `Optimizer`: `adadelta`, `adagrad`, `adam`, `getWeightDecay`, `nag`, `rmsprop`, `sgd`, `updateCount`, `withDefaultState`
protected Nag(Nag.Builder builder)

Creates a new instance of `Nag` optimizer.

Parameters:
`builder` - the builder to create a new instance of the `Nag` optimizer
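Because `update` is keyed by `parameterId`, the optimizer must keep one momentum state per parameter. The sketch below shows one way such per-parameter state can be organized, using plain `double[]` arrays in place of `NDArray`; the class and its internals are illustrative assumptions, not DJL's actual implementation:

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Illustrative sketch of a NAG-style updater that keeps one momentum
 * buffer per parameterId. double[] stands in for NDArray; this is
 * NOT DJL's internal implementation.
 */
public class NagLikeUpdater {
    private final double lr;
    private final double momentum;
    private final double wd;
    // one momentum buffer per parameterId
    private final Map<String, double[]> states = new HashMap<>();

    public NagLikeUpdater(double lr, double momentum, double wd) {
        this.lr = lr;
        this.momentum = momentum;
        this.wd = wd;
    }

    /** Updates weight in place according to the two formulas above. */
    public void update(String parameterId, double[] weight, double[] grad) {
        double[] state =
                states.computeIfAbsent(parameterId, k -> new double[weight.length]);
        for (int i = 0; i < weight.length; i++) {
            // state = momentum * state + grad + wd * weight
            state[i] = momentum * state[i] + grad[i] + wd * weight[i];
            // weight = weight - lr * (grad + momentum * state)
            weight[i] -= lr * (grad[i] + momentum * state[i]);
        }
    }

    public static void main(String[] args) {
        NagLikeUpdater opt = new NagLikeUpdater(0.1, 0.9, 0.0);
        double[] w = {1.0};
        opt.update("w", w, new double[] {0.5});
        System.out.println(w[0]);
        // a second call for the same parameterId reuses the stored momentum state
        opt.update("w", w, new double[] {0.5});
        System.out.println(w[0]);
    }
}
```

Keying the state map on `parameterId` lets a single optimizer instance serve every parameter of a model while each parameter accumulates its own momentum history.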