Class Sgd


public class Sgd extends Optimizer
Sgd is a Stochastic Gradient Descent (SGD) optimizer.

If momentum is not set, it updates the weights using the following update function:
\( weight = weight - learning\_rate * (gradient + wd * weight) \),
where wd is the weight decay.

If momentum is set, it updates the weights using the following update function:
\( state = momentum * state + learning\_rate * gradient \)
\( weight = weight - state \)
The momentum update typically yields better convergence rates when training neural networks.
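
For illustration, here is a minimal plain-Java sketch of both update rules exactly as written above; the helper names applySgd and applySgdWithMomentum are hypothetical and not part of this class:

    // Vanilla SGD step: weight = weight - lr * (gradient + wd * weight).
    static void applySgd(float[] weight, float[] grad, float lr, float wd) {
        for (int i = 0; i < weight.length; i++) {
            weight[i] -= lr * (grad[i] + wd * weight[i]);
        }
    }

    // Momentum step: state = momentum * state + lr * gradient, then
    // weight = weight - state.
    static void applySgdWithMomentum(float[] weight, float[] grad, float[] state,
                                     float lr, float momentum) {
        for (int i = 0; i < weight.length; i++) {
            state[i] = momentum * state[i] + lr * grad[i];
            weight[i] -= state[i];
        }
    }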

  • Constructor Details

    • Sgd

      protected Sgd(Sgd.Builder builder)
      Creates a new instance of Sgd.
      Parameters:
      builder - the builder to create a new instance of Sgd
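
      Because the constructor is protected, an Sgd instance is normally obtained through its builder. Below is a sketch under the assumption that the DJL factory methods Optimizer.sgd() and Tracker.fixed(float) are available; builder method names can vary across versions:

          import ai.djl.training.optimizer.Optimizer;
          import ai.djl.training.tracker.Tracker;

          // Sketch: configure Sgd through Sgd.Builder rather than the
          // protected constructor. optMomentum enables the momentum update.
          Optimizer sgd = Optimizer.sgd()
                  .setLearningRateTracker(Tracker.fixed(0.03f))
                  .optMomentum(0.9f)
                  .build();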
  • Method Details

    • update

      public void update(String parameterId, NDArray weight, NDArray grad)
      Updates the parameters according to the gradients.
      Specified by:
      update in class Optimizer
      Parameters:
      parameterId - the ID of the parameter to be updated
      weight - the weights of the parameter
      grad - the gradients
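
      As a usage sketch, update can also be invoked directly on NDArrays, although in normal training the Trainer calls it for each parameter. This assumes DJL's NDManager API and reuses the sgd instance from the builder sketch above; "fc1_weight" is a hypothetical parameter ID:

          import ai.djl.ndarray.NDArray;
          import ai.djl.ndarray.NDManager;

          // Sketch: one manual SGD step on a small weight vector.
          try (NDManager manager = NDManager.newBaseManager()) {
              NDArray weight = manager.create(new float[] {1f, 2f, 3f});
              NDArray grad = manager.create(new float[] {0.1f, 0.2f, 0.3f});
              sgd.update("fc1_weight", weight, grad); // updates weight in place
          }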