public class Adam extends Optimizer
`Adam` is a generalization of the AdaGrad `Optimizer`.
Adam updates the weights using:
\( m = \beta_1 \, m + (1 - \beta_1) \, g \)
\( v = \beta_2 \, v + (1 - \beta_2) \, g^2 \)
\( w \leftarrow w - \text{learning\_rate} \cdot m / (\sqrt{v} + \epsilon) \)
where \( g \) is the gradient, and \( m \) and \( v \) are the first- and second-order moment estimates (mean and uncentered variance).
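The update rule above can be sketched in plain Java for a single scalar weight. This is an illustrative sketch only, not DJL's `NDArray`-based implementation; the class name `AdamSketch` is hypothetical, and the hyperparameter values are common defaults assumed for this example.

```java
// Illustrative scalar sketch of the Adam update rule; not DJL's actual
// NDArray-based implementation. Hyperparameter values are assumed defaults.
public class AdamSketch {
    double beta1 = 0.9;          // decay rate for the mean estimate m
    double beta2 = 0.999;        // decay rate for the variance estimate v
    double learningRate = 0.001;
    double epsilon = 1e-8;       // avoids division by zero

    double m = 0.0;              // 1st-order moment estimate
    double v = 0.0;              // 2nd-order moment estimate

    // Applies one Adam step to weight w given its gradient; returns the new weight.
    public double step(double w, double grad) {
        m = beta1 * m + (1 - beta1) * grad;
        v = beta2 * v + (1 - beta2) * grad * grad;
        return w - learningRate * m / (Math.sqrt(v) + epsilon);
    }
}
```

Because `m` and `v` are fields, each call to `step` accumulates moment statistics across iterations, mirroring how an optimizer keeps per-parameter state between updates.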
Modifier and Type | Class and Description
---|---
`static class` | `Adam.Builder`: The Builder to construct an `Adam` object.
Nested classes/interfaces inherited from class `Optimizer`:
`Optimizer.OptimizerBuilder<T extends Optimizer.OptimizerBuilder>`
Fields inherited from class `Optimizer`:
`clipGrad`, `rescaleGrad`
Modifier | Constructor and Description
---|---
`protected` | `Adam(Adam.Builder builder)`: Creates a new instance of `Adam` optimizer.
Modifier and Type | Method and Description
---|---
`static Adam.Builder` | `builder()`: Creates a builder to build an `Adam`.
`void` | `update(java.lang.String parameterId, NDArray weight, NDArray grad)`: Updates the parameters according to the gradients.
Methods inherited from class `Optimizer`:
`adadelta`, `adagrad`, `adam`, `getWeightDecay`, `nag`, `rmsprop`, `sgd`, `updateCount`, `withDefaultState`
`protected Adam(Adam.Builder builder)`
Creates a new instance of `Adam` optimizer.
Parameters:
`builder` - the builder to create a new instance of `Adam` optimizer

`public void update(java.lang.String parameterId, NDArray weight, NDArray grad)`
Updates the parameters according to the gradients.

`public static Adam.Builder builder()`
Creates a builder to build an `Adam`.
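As a rough illustration of how repeated update calls drive a parameter toward a minimum, the scalar form of the rule can be iterated on \( f(w) = (w - 3)^2 \). This is a self-contained toy example in plain Java with a hypothetical class name; the learning rate 0.1 is chosen so the toy converges quickly and is not a DJL default.

```java
// Toy demo: iterate the Adam update rule to minimize f(w) = (w - 3)^2.
// Hypothetical class name; learningRate = 0.1 is an assumption for this
// example, larger than typical defaults, so convergence is fast.
public class AdamDemo {
    public static double minimize() {
        double beta1 = 0.9, beta2 = 0.999, learningRate = 0.1, epsilon = 1e-8;
        double m = 0.0, v = 0.0;   // moment estimates, carried across steps
        double w = 0.0;            // initial weight
        for (int i = 0; i < 1000; i++) {
            double grad = 2 * (w - 3);                     // gradient of (w - 3)^2
            m = beta1 * m + (1 - beta1) * grad;
            v = beta2 * v + (1 - beta2) * grad * grad;
            w -= learningRate * m / (Math.sqrt(v) + epsilon);
        }
        return w;                  // should end up near the minimum at w = 3
    }
}
```

In DJL itself the equivalent loop would call `update(parameterId, weight, grad)` once per training step, with the moment state kept internally per parameter.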