public class Adagrad extends Optimizer
Adagrad is an AdaGrad Optimizer. This class implements the AdaGrad algorithm, which updates the weights using:
\( grad = clip(grad * rescale\_grad, clip\_grad) + wd * weight \)
\( history += grad^2 \)
\( weight -= lr * grad / (sqrt(history) + epsilon) \)
where grad represents the gradient, wd the weight decay, lr the learning rate, rescale_grad and clip_grad the gradient rescaling factor and clipping threshold (the inherited rescaleGrad and clipGrad fields), and epsilon a small constant added for numerical stability.
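As a concrete illustration of the update rule above, the sketch below applies one AdaGrad step to plain double arrays. It is only a sketch mirroring the formulas, not the class's actual implementation; the real optimizer operates on NDArray instances, and the parameter and method names here are chosen for illustration.

```java
// Illustrative sketch only (not the class's actual code): one AdaGrad step on
// plain double arrays, mirroring the formulas above.
final class AdagradStepSketch {
    static void step(double[] weight, double[] grad, double[] history,
                     double lr, double wd, double rescaleGrad,
                     double clipGrad, double epsilon) {
        for (int i = 0; i < weight.length; i++) {
            // grad = clip(grad * rescale_grad, clip_grad) + wd * weight
            double g = grad[i] * rescaleGrad;
            if (clipGrad > 0) {
                g = Math.max(-clipGrad, Math.min(clipGrad, g));
            }
            g += wd * weight[i];
            // history += grad^2
            history[i] += g * g;
            // weight -= lr * grad / (sqrt(history) + epsilon)
            weight[i] -= lr * g / (Math.sqrt(history[i]) + epsilon);
        }
    }
}
```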
Modifier and Type | Class and Description |
---|---|
static class | Adagrad.Builder The Builder to construct an Adagrad object. |
Nested classes/interfaces inherited from class Optimizer: Optimizer.OptimizerBuilder<T extends Optimizer.OptimizerBuilder>
Fields inherited from class Optimizer: clipGrad, rescaleGrad
Modifier | Constructor and Description |
---|---|
protected | Adagrad(Adagrad.Builder builder) Creates a new instance of the Adagrad optimizer. |
Modifier and Type | Method and Description |
---|---|
static Adagrad.Builder | builder() Creates a builder to build an Adagrad. |
void | update(java.lang.String parameterId, NDArray weight, NDArray grad) Updates the parameters according to the gradients. |
Methods inherited from class Optimizer: adadelta, adagrad, adam, getWeightDecay, nag, rmsprop, sgd, updateCount, withDefaultState
protected Adagrad(Adagrad.Builder builder)
Creates a new instance of the Adagrad optimizer.
Parameters:
builder - the builder to create a new instance of the Adagrad optimizer

public void update(java.lang.String parameterId, NDArray weight, NDArray grad)
Updates the parameters according to the gradients.
public static Adagrad.Builder builder()
Creates a builder to build an Adagrad.
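A minimal usage sketch, assuming the standard DJL package layout (ai.djl.*) and a build() method on Adagrad.Builder as with other OptimizerBuilder implementations; depending on the DJL version, the builder may require additional options (for example a learning-rate setting) before build(). The parameter id "myParam" and the array values are hypothetical.

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.training.optimizer.Adagrad;
import ai.djl.training.optimizer.Optimizer;

public class AdagradUsageSketch {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            // Assumed: Adagrad.Builder exposes build(); further options may be required.
            Optimizer optimizer = Adagrad.builder().build();

            NDArray weight = manager.create(new float[] {1f, 2f, 3f});
            NDArray grad = manager.create(new float[] {0.1f, -0.2f, 0.05f});

            // "myParam" is a hypothetical parameter id; update() adjusts weight in place
            // using the accumulated squared-gradient history kept for that id.
            optimizer.update("myParam", weight, grad);
        }
    }
}
```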