Class Adagrad

java.lang.Object
ai.djl.training.optimizer.Optimizer
ai.djl.training.optimizer.Adagrad

public class Adagrad extends Optimizer
Adagrad is an AdaGrad Optimizer.

This class implements the AdaGrad optimization algorithm.

Adagrad updates the weights using:

\( grad = clip(grad * resc_grad, clip_grad) + wd * weight \)
\( history += grad^2 \)
\( weight -= lr * grad / (sqrt(history) + epsilon) \)

where grad represents the gradient, wd the weight decay, lr the learning rate, resc_grad the gradient rescaling factor, clip_grad the gradient clipping bound, and epsilon a small constant added for numerical stability.
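
The update rule can be illustrated with a short, self-contained sketch in plain Java. This is an illustrative sketch only: the step method and the names history, lr, wd, rescaleGrad, clipGrad, and epsilon are introduced here to mirror the symbols above and are not part of the DJL API.

    // Plain-Java sketch of the AdaGrad update rule shown above.
    public class AdagradSketch {

        // Clip x into [-bound, bound]; a non-positive bound disables clipping.
        static double clip(double x, double bound) {
            return bound <= 0 ? x : Math.max(-bound, Math.min(bound, x));
        }

        // Applies one AdaGrad step in place to weight, accumulating the
        // squared gradients in history.
        static void step(double[] weight, double[] grad, double[] history,
                         double lr, double wd, double rescaleGrad,
                         double clipGrad, double epsilon) {
            for (int i = 0; i < weight.length; i++) {
                // grad = clip(grad * resc_grad, clip_grad) + wd * weight
                double g = clip(grad[i] * rescaleGrad, clipGrad) + wd * weight[i];
                // history += grad^2
                history[i] += g * g;
                // weight -= lr * grad / (sqrt(history) + epsilon)
                weight[i] -= lr * g / (Math.sqrt(history[i]) + epsilon);
            }
        }
    }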

  • Constructor Details

    • Adagrad

      protected Adagrad(Adagrad.Builder builder)
      Creates a new instance of the Adagrad optimizer.
      Parameters:
      builder - the builder to create a new instance of the Adagrad optimizer
  • Method Details

    • update

      public void update(String parameterId, NDArray weight, NDArray grad)
      Updates the parameters according to the gradients. A usage sketch follows this method list.
      Specified by:
      update in class Optimizer
      Parameters:
      parameterId - the parameter to be updated
      weight - the weights of the parameter
      grad - the gradients
    • builder

      public static Adagrad.Builder builder()
      Creates a builder to build an Adagrad.
      Returns:
      a new builder
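
Taken together, a minimal usage sketch might look like the following. It is a sketch under assumptions, not a definitive recipe: the Builder is assumed to expose a standard build() method with acceptable defaults, the parameter id "w0" and the sample values are hypothetical, and whether the direct call executes depends on the engine on the classpath.

    import ai.djl.ndarray.NDArray;
    import ai.djl.ndarray.NDManager;
    import ai.djl.training.optimizer.Adagrad;

    public class AdagradExample {
        public static void main(String[] args) {
            try (NDManager manager = NDManager.newBaseManager()) {
                // Build an Adagrad optimizer with the builder's defaults
                // (build() is assumed to follow DJL's usual builder pattern).
                Adagrad adagrad = Adagrad.builder().build();

                // Hypothetical parameter "w0" with its weights and gradients.
                NDArray weight = manager.create(new float[] {1f, 2f, 3f});
                NDArray grad = manager.create(new float[] {0.1f, -0.2f, 0.3f});

                // One in-place AdaGrad step for this parameter.
                adagrad.update("w0", weight, grad);
                System.out.println(weight);
            }
        }
    }

In normal use the optimizer is not driven by hand: it is passed to a training configuration (for example via DefaultTrainingConfig.optOptimizer), and the resulting Trainer calls update for every trainable parameter after each backward pass.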