public class Sgd extends Optimizer
Sgd is a Stochastic Gradient Descent (SGD) optimizer.

If momentum is not set, it updates weights using the following update function:

\( weight = weight - learning\_rate \cdot (gradient + wd \cdot weight) \)

If momentum is set, it updates weights using the following update function:

\( state = momentum \cdot state + learning\_rate \cdot gradient \)

\( weight = weight - state \)

The momentum update often gives better convergence rates on neural networks.
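To make the two update rules concrete, here is a minimal, framework-free sketch in plain Java. It uses float arrays instead of NDArray, and the class and method names are illustrative only, not part of DJL:

```java
// Illustrative sketch of the two SGD update rules above; not DJL's implementation.
public class SgdSketch {

    // Plain SGD: weight = weight - learning_rate * (gradient + wd * weight)
    static void update(float[] weight, float[] gradient, float learningRate, float wd) {
        for (int i = 0; i < weight.length; i++) {
            weight[i] -= learningRate * (gradient[i] + wd * weight[i]);
        }
    }

    // Momentum SGD: state  = momentum * state + learning_rate * gradient
    //               weight = weight - state
    static void updateWithMomentum(
            float[] weight, float[] gradient, float[] state,
            float learningRate, float momentum) {
        for (int i = 0; i < weight.length; i++) {
            state[i] = momentum * state[i] + learningRate * gradient[i];
            weight[i] -= state[i];
        }
    }
}
```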
Modifier and Type | Class and Description |
---|---|
static class | Sgd.Builder: The Builder to construct an Sgd object. |
Nested classes/interfaces inherited from class Optimizer: Optimizer.OptimizerBuilder<T extends Optimizer.OptimizerBuilder>
Fields inherited from class Optimizer: clipGrad, rescaleGrad
Modifier | Constructor and Description |
---|---|
protected | Sgd(Sgd.Builder builder): Creates a new instance of Sgd. |
Modifier and Type | Method and Description |
---|---|
void | update(java.lang.String parameterId, NDArray weight, NDArray grad): Updates the parameters according to the gradients. |
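A usage sketch for update, assuming an engine (e.g. MXNet or PyTorch) is on the classpath. The parameter id "param0" and the array values are made up for illustration, and builder method names such as setLearningRateTracker vary across DJL versions, so check the release you are using:

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.training.optimizer.Optimizer;
import ai.djl.training.tracker.Tracker;

public class SgdUpdateExample {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray weight = manager.create(new float[] {1f, 2f, 3f});
            NDArray grad = manager.create(new float[] {0.1f, 0.1f, 0.1f});

            Optimizer sgd = Optimizer.sgd()
                    .setLearningRateTracker(Tracker.fixed(0.1f))
                    .build();

            // One SGD step, updating weight in place:
            // weight <- weight - lr * grad (weight decay defaults to 0)
            sgd.update("param0", weight, grad);
            System.out.println(weight); // roughly [0.99, 1.99, 2.99]
        }
    }
}
```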
Methods inherited from class Optimizer: adadelta, adagrad, adam, getWeightDecay, nag, rmsprop, sgd, updateCount, withDefaultState
protected Sgd(Sgd.Builder builder)
Creates a new instance of Sgd.

Parameters:

builder - the builder to create a new instance of Sgd
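Since the constructor is protected, an Sgd instance is obtained through its Builder. A minimal sketch, assuming the builder exposes setLearningRateTracker(Tracker) and optMomentum(float) as in recent DJL releases (names differ in older versions):

```java
import ai.djl.training.optimizer.Optimizer;
import ai.djl.training.optimizer.Sgd;
import ai.djl.training.tracker.Tracker;

public class BuildSgdExample {
    public static void main(String[] args) {
        // Construct an Sgd instance through its Builder.
        Sgd sgd = Optimizer.sgd()
                .setLearningRateTracker(Tracker.fixed(0.1f)) // fixed learning rate of 0.1
                .optMomentum(0.9f)                           // enables the momentum update
                .build();
    }
}
```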