Package ai.djl.training.optimizer

Contains classes for optimizing a neural network Block. The package centers on the main
interface Optimizer and the various optimizers that extend it. Helpers for learning
rates are in ai.djl.training.tracker.

Class Summary

Adadelta          Adadelta is an implementation of the Adadelta optimizer.
Adadelta.Builder  The Builder to construct an Adadelta object.
Adagrad           Adagrad is an implementation of the AdaGrad optimizer.
Adagrad.Builder   The Builder to construct an Adagrad object.
Adam              Adam is a generalization of the AdaGrad optimizer.
Adam.Builder      The Builder to construct an Adam object.
AdamW             AdamW is a variant of Adam that decouples weight decay from the gradient update.
AdamW.Builder     The Builder to construct an AdamW object.
Nag               Nag is a Nesterov accelerated gradient optimizer.
Nag.Builder       The Builder to construct a Nag object.
Optimizer         An Optimizer updates the weight parameters to minimize the loss function.
Optimizer.OptimizerBuilder<T extends Optimizer.OptimizerBuilder>
                  The Builder to construct an Optimizer.
RmsProp           The RMSProp optimizer.
RmsProp.Builder   The Builder to construct an RmsProp object.
Sgd               Sgd is a Stochastic Gradient Descent (SGD) optimizer.
Sgd.Builder       The Builder to construct an Sgd object.
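
To illustrate what "updates the weight parameters to minimize the loss function" means for the simplest class above, here is a minimal, self-contained plain-Java sketch of the vanilla SGD update rule (weight := weight - learningRate * gradient). This is illustrative code only, not the DJL API: the class and method names (SgdSketch, update) are invented for this example, and the real Sgd operates on NDArray parameters with a learning-rate Tracker rather than raw float arrays.

```java
/**
 * Minimal sketch of the update rule behind a plain SGD optimizer:
 *   weight := weight - learningRate * gradient
 * Hypothetical illustration, NOT part of DJL.
 */
public class SgdSketch {
    private final float learningRate;

    public SgdSketch(float learningRate) {
        this.learningRate = learningRate;
    }

    /** Applies one in-place SGD step to the weight vector. */
    public void update(float[] weights, float[] gradients) {
        for (int i = 0; i < weights.length; i++) {
            weights[i] -= learningRate * gradients[i];
        }
    }

    public static void main(String[] args) {
        SgdSketch sgd = new SgdSketch(0.1f);
        float[] weights = {1.0f, -2.0f};
        float[] gradients = {0.5f, -0.5f};
        sgd.update(weights, gradients);
        // Each weight moves opposite its gradient, scaled by the learning rate.
        System.out.println(weights[0] + " " + weights[1]); // prints "0.95 -1.95"
    }
}
```

The concrete optimizers in this package (Adam, RmsProp, etc.) refine this same step with per-parameter state such as running moment estimates, which is why each is constructed through its own Builder carrying those extra hyperparameters.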