Package ai.djl.training.optimizer
Contains classes for optimizing a neural network Block.

It contains the main interface Optimizer and various optimizers that extend it. There are also helpers for learning rates in ai.djl.training.tracker.
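For orientation, a minimal sketch of how these pieces fit together, assuming a recent DJL release in which learning rates are supplied as Tracker objects; the learning rate of 0.001f is an arbitrary illustrative value:

```java
import ai.djl.training.DefaultTrainingConfig;
import ai.djl.training.loss.Loss;
import ai.djl.training.optimizer.Optimizer;
import ai.djl.training.tracker.Tracker;

public class OptimizerSetup {

    public static void main(String[] args) {
        // Build an Adam optimizer. The learning rate is supplied as a
        // Tracker from ai.djl.training.tracker; Tracker.fixed keeps it constant.
        Optimizer adam =
                Optimizer.adam()
                        .optLearningRateTracker(Tracker.fixed(0.001f))
                        .build();

        // The optimizer is typically handed to a training configuration;
        // the Trainer then uses it to update the Block's parameters.
        DefaultTrainingConfig config =
                new DefaultTrainingConfig(Loss.softmaxCrossEntropyLoss())
                        .optOptimizer(adam);
    }
}
```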
Classes

| Class | Description |
| --- | --- |
| Adadelta | Adadelta is an Adadelta Optimizer. |
| Adadelta.Builder | The Builder to construct an Adadelta object. |
| Adagrad | Adagrad is an AdaGrad Optimizer. |
| Adagrad.Builder | The Builder to construct an Adagrad object. |
| Adam | Adam is a generalization of the AdaGrad Optimizer. |
| Adam.Builder | The Builder to construct an Adam object. |
| AdamW | AdamW is a generalization of the AdaGrad Optimizer. |
| AdamW.Builder | The Builder to construct an AdamW object. |
| Nag | Nag is a Nesterov accelerated gradient optimizer. |
| Nag.Builder | The Builder to construct a Nag object. |
| Optimizer | An Optimizer updates the weight parameters to minimize the loss function. |
| Optimizer.OptimizerBuilder | The Builder to construct an Optimizer. |
| RmsProp | The RMSProp Optimizer. |
| RmsProp.Builder | The Builder to construct an RmsProp object. |
| Sgd | Sgd is a Stochastic Gradient Descent (SGD) optimizer. |
| Sgd.Builder | The Builder to construct an Sgd object. |
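Each concrete optimizer in the table above is constructed through its nested Builder, obtained from a static factory method on Optimizer. A short sketch, assuming the standard factory methods Optimizer.sgd() and Optimizer.nag() and arbitrary hyperparameter values; per the usual DJL builder convention, set-prefixed methods are required settings and opt-prefixed methods are optional:

```java
import ai.djl.training.optimizer.Optimizer;
import ai.djl.training.tracker.Tracker;

public class BuilderExamples {

    public static void main(String[] args) {
        // Sgd: the learning rate tracker is required (set...),
        // momentum is optional (opt...). Values are illustrative.
        Optimizer sgd =
                Optimizer.sgd()
                        .setLearningRateTracker(Tracker.fixed(0.01f))
                        .optMomentum(0.9f)
                        .build();

        // Nag: Nesterov accelerated gradient; here momentum is a
        // required setting rather than an optional one.
        Optimizer nag =
                Optimizer.nag()
                        .setLearningRateTracker(Tracker.fixed(0.01f))
                        .setMomentum(0.9f)
                        .build();

        // During training, the Trainer calls the optimizer's update method
        // for each parameter; user code rarely invokes it directly.
    }
}
```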