Package ai.djl.training.optimizer

Class Optimizer

java.lang.Object
    ai.djl.training.optimizer.Optimizer

An Optimizer updates the weight parameters to minimize the loss function. Optimizer is an abstract class that provides the base implementation for optimizers.
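As a usage sketch (not part of this page): an optimizer is built through one of the static builder methods listed below and handed to a training configuration. The `Tracker.fixed`, `DefaultTrainingConfig`, `Loss.softmaxCrossEntropyLoss`, and `optLearningRateTracker` calls are assumed from the wider DJL API, not documented here.

```java
import ai.djl.training.DefaultTrainingConfig;
import ai.djl.training.loss.Loss;
import ai.djl.training.optimizer.Optimizer;
import ai.djl.training.tracker.Tracker;

public class OptimizerExample {
    public static void main(String[] args) {
        // Build an Adam optimizer with a fixed learning rate via the static
        // builder method on Optimizer (builder setter name assumed).
        Optimizer adam = Optimizer.adam()
                .optLearningRateTracker(Tracker.fixed(0.001f))
                .build();

        // An optimizer is normally supplied to a training configuration
        // rather than invoked directly.
        DefaultTrainingConfig config =
                new DefaultTrainingConfig(Loss.softmaxCrossEntropyLoss())
                        .optOptimizer(adam);
    }
}
```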
Nested Class Summary

Field Summary

Constructor Summary

Optimizer(Optimizer.OptimizerBuilder<?> builder)
    Creates a new instance of Optimizer.
Method Summary

static Adadelta.Builder adadelta()
    Returns a new instance of Adadelta.Builder that can build an Adadelta optimizer.
static Adagrad.Builder adagrad()
    Returns a new instance of Adagrad.Builder that can build an Adagrad optimizer.
static Adam.Builder adam()
    Returns a new instance of Adam.Builder that can build an Adam optimizer.
static AdamW.Builder adamW()
    Returns a new instance of AdamW.Builder that can build an AdamW optimizer.
protected float getWeightDecay()
    Gets the value of weight decay.
static Nag.Builder nag()
    Returns a new instance of Nag.Builder that can build a Nag optimizer.
static RmsProp.Builder rmsprop()
    Returns a new instance of RmsProp.Builder that can build an RmsProp optimizer.
static Sgd.Builder sgd()
    Returns a new instance of Sgd.Builder that can build an Sgd optimizer.
abstract void update(String parameterId, NDArray weight, NDArray grad)
    Updates the parameters according to the gradients.
protected int updateCount(String parameterId)
protected NDArray withDefaultState(Map<String,Map<Device,NDArray>> state, String key, Device device, Function<String,NDArray> defaultFunction)
Field Details

rescaleGrad
protected float rescaleGrad

clipGrad
protected float clipGrad
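The rescaleGrad and clipGrad fields are protected and populated from the builder. A hedged sketch of setting them, assuming the common DJL OptimizerBuilder setter names optRescaleGrad and optClipGrad (not documented on this page):

```java
import ai.djl.training.optimizer.Optimizer;
import ai.djl.training.tracker.Tracker;

public class GradientScalingExample {
    public static void main(String[] args) {
        // rescaleGrad and clipGrad are read by Optimizer implementations
        // when applying gradients; setter names here are assumptions.
        Optimizer sgd = Optimizer.sgd()
                .setLearningRateTracker(Tracker.fixed(0.01f))
                .optRescaleGrad(1.0f / 32) // e.g. divide gradients by the batch size
                .optClipGrad(5f)           // clip gradient values to [-5, 5]
                .build();
    }
}
```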
Constructor Details

Optimizer
Optimizer(Optimizer.OptimizerBuilder<?> builder)
Creates a new instance of Optimizer.
Parameters:
    builder - the builder used to create an instance of Optimizer
Method Details

sgd
static Sgd.Builder sgd()
Returns a new instance of Sgd.Builder that can build an Sgd optimizer.
Returns:
    the Sgd.Builder
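For illustration, a possible call chain for sgd(); the setLearningRateTracker and optMomentum setters are assumed from the DJL Sgd.Builder API rather than documented here:

```java
import ai.djl.training.optimizer.Optimizer;
import ai.djl.training.optimizer.Sgd;
import ai.djl.training.tracker.Tracker;

public class SgdExample {
    public static void main(String[] args) {
        // A plain SGD optimizer with momentum; the learning-rate tracker is
        // the builder's required setting (setter names assumed).
        Sgd sgd = Optimizer.sgd()
                .setLearningRateTracker(Tracker.fixed(0.01f))
                .optMomentum(0.9f)
                .build();
    }
}
```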
nag
static Nag.Builder nag()
Returns a new instance of Nag.Builder that can build a Nag optimizer.
Returns:
    the Nag.Builder
adam
static Adam.Builder adam()
Returns a new instance of Adam.Builder that can build an Adam optimizer.
Returns:
    the Adam.Builder
adamW
static AdamW.Builder adamW()
Returns a new instance of AdamW.Builder that can build an AdamW optimizer.
Returns:
    the AdamW.Builder
rmsprop
static RmsProp.Builder rmsprop()
Returns a new instance of RmsProp.Builder that can build an RmsProp optimizer.
Returns:
    the RmsProp.Builder
adagrad
static Adagrad.Builder adagrad()
Returns a new instance of Adagrad.Builder that can build an Adagrad optimizer.
Returns:
    the Adagrad.Builder
adadelta
static Adadelta.Builder adadelta()
Returns a new instance of Adadelta.Builder that can build an Adadelta optimizer.
Returns:
    the Adadelta.Builder
getWeightDecay
protected float getWeightDecay()
Gets the value of weight decay.
Returns:
    the value of weight decay
updateCount
protected int updateCount(String parameterId)
update
abstract void update(String parameterId, NDArray weight, NDArray grad)
Updates the parameters according to the gradients.
Parameters:
    parameterId - the parameter to be updated
    weight - the weights of the parameter
    grad - the gradients
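The update contract above can be illustrated with a plain SGD step applied in place to an NDArray. The in-place subi and element-wise mul operations, NDManager.newBaseManager, and manager.create are assumed from the DJL NDArray API; the rule shown is generic SGD, not any particular subclass's implementation:

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;

public class UpdateSketch {
    public static void main(String[] args) {
        // Conceptual sketch of what an update(parameterId, weight, grad)
        // implementation does: mutate the weights in place from the gradients.
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray weight = manager.create(new float[] {1f, 2f, 3f});
            NDArray grad = manager.create(new float[] {0.3f, 0.3f, 0.3f});
            float lr = 0.1f;
            // Plain SGD rule: w <- w - lr * g (subclasses apply richer rules,
            // e.g. momentum or adaptive per-parameter scaling).
            weight.subi(grad.mul(lr));
        }
    }
}
```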
withDefaultState
protected NDArray withDefaultState(Map<String,Map<Device,NDArray>> state, String key, Device device, Function<String,NDArray> defaultFunction)