Class BaseHpOptimizer
- java.lang.Object
-
- ai.djl.training.hyperparameter.optimizer.BaseHpOptimizer
-
- All Implemented Interfaces:
HpOptimizer
- Direct Known Subclasses:
HpORandom
public abstract class BaseHpOptimizer extends java.lang.Object implements HpOptimizer
A base containing shared implementations for HpOptimizers.
- See Also:
HpOptimizer
-
-
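A typical hyperparameter search drives an HpOptimizer through nextConfig, update, and getBest. The sketch below is illustrative only: it assumes nextConfig() takes no arguments and returns an HpSet proposal, assumes HpSet lives in the ai.djl.training.hyperparameter.param package, and uses a hypothetical trainAndEvaluate helper in place of a real DJL training run.

import ai.djl.training.hyperparameter.optimizer.HpOptimizer;
import ai.djl.training.hyperparameter.param.HpSet;
import ai.djl.util.Pair;

public final class HpSearchSketch {

    // Hypothetical helper: trains a model with the given hyperparameters
    // and returns its validation loss.
    static float trainAndEvaluate(HpSet config) {
        return 0f; // placeholder for an actual training run
    }

    static Pair<HpSet, Float> search(HpOptimizer optimizer, int trials) {
        for (int i = 0; i < trials; i++) {
            HpSet config = optimizer.nextConfig(); // propose a hyperparameter set
            float loss = trainAndEvaluate(config); // measure its validation loss
            optimizer.update(config, loss);        // record the result
        }
        return optimizer.getBest();                // best hyperparameters and their loss
    }
}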
Field Summary
Fields
- protected HpSet hyperParams
- protected java.util.Map<HpSet,java.lang.Float> results
-
Constructor Summary
Constructors
- BaseHpOptimizer(HpSet hyperParams): Constructs a BaseHpOptimizer.
-
Method Summary
Instance Methods (Concrete)
- ai.djl.util.Pair<HpSet,java.lang.Float> getBest(): Returns the best hyperparameters and loss.
- float getLoss(HpSet config): Returns the recorded loss.
- void update(HpSet config, float loss): Updates the optimizer with the results of a hyperparameter test.
-
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface ai.djl.training.hyperparameter.optimizer.HpOptimizer
nextConfig
-
Constructor Detail
-
BaseHpOptimizer
public BaseHpOptimizer(HpSet hyperParams)
Constructs a BaseHpOptimizer.
- Parameters:
hyperParams - the set of hyperparameters
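Since BaseHpOptimizer is abstract, the constructor is called from a concrete subclass, which implements the inherited nextConfig() on top of the protected hyperParams and results fields. A minimal, purely illustrative subclass is sketched below; it assumes nextConfig() returns an HpSet and that HpSet lives in ai.djl.training.hyperparameter.param, and for brevity it simply re-proposes the initial set rather than sampling new configurations.

import ai.djl.training.hyperparameter.optimizer.BaseHpOptimizer;
import ai.djl.training.hyperparameter.param.HpSet;

// Illustrative only: a real optimizer would sample or search for new configurations.
public class FixedHpOptimizer extends BaseHpOptimizer {

    public FixedHpOptimizer(HpSet hyperParams) {
        super(hyperParams); // the set is kept in the protected hyperParams field
    }

    @Override
    public HpSet nextConfig() {
        return hyperParams; // always re-propose the initial hyperparameter set
    }
}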
-
-
Method Detail
-
update
public void update(HpSet config, float loss)
Updates the optimizer with the results of a hyperparameter test.
- Specified by:
update in interface HpOptimizer
- Parameters:
config - the tested hyperparameters
loss - the validation loss from training with those hyperparameters
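The parameters of update suggest that any finished trial can be reported back to the optimizer, so one might, for example, warm-start a fresh optimizer with results measured in an earlier run. This is an assumption about usage, not documented library behaviour, and the sketch reuses the HpSet package assumption from above.

import ai.djl.training.hyperparameter.optimizer.HpOptimizer;
import ai.djl.training.hyperparameter.param.HpSet;

import java.util.Map;

final class WarmStartSketch {

    // Replays previously measured (config, loss) results into a fresh optimizer.
    static void warmStart(HpOptimizer optimizer, Map<HpSet, Float> earlierResults) {
        for (Map.Entry<HpSet, Float> entry : earlierResults.entrySet()) {
            optimizer.update(entry.getKey(), entry.getValue()); // record each earlier trial
        }
    }
}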
-
getLoss
public float getLoss(HpSet config)
Returns the recorded loss.
- Specified by:
getLoss in interface HpOptimizer
- Parameters:
config - the hyperparameters that were trained with
- Returns:
the loss
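As a sketch of how update and getLoss work together (same assumptions about HpSet as above): once a result has been recorded with update, getLoss returns it for that configuration, which makes it possible, for example, to avoid retraining a configuration that was already evaluated. The helper below only queries configurations it has recorded itself, since the behaviour for unrecorded configurations is not documented here.

import ai.djl.training.hyperparameter.optimizer.HpOptimizer;
import ai.djl.training.hyperparameter.param.HpSet;

final class RecordedLossSketch {

    // Records a measured validation loss, then reads it back via getLoss.
    static float recordAndReadBack(HpOptimizer optimizer, HpSet config, float measuredLoss) {
        optimizer.update(config, measuredLoss); // record the trial result
        return optimizer.getLoss(config);       // the loss recorded above
    }
}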
-
getBest
public ai.djl.util.Pair<HpSet,java.lang.Float> getBest()
Returns the best hyperparameters and loss.
- Specified by:
getBest in interface HpOptimizer
- Returns:
the best hyperparameters and loss
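After a number of trials have been recorded, the winning configuration can be unpacked from the returned pair as sketched below; this assumes ai.djl.util.Pair exposes getKey() and getValue() accessors and reuses the HpSet package assumption from the earlier examples.

import ai.djl.training.hyperparameter.optimizer.HpOptimizer;
import ai.djl.training.hyperparameter.param.HpSet;
import ai.djl.util.Pair;

final class BestResultSketch {

    // Prints the best configuration found so far and returns its recorded loss.
    static float reportBest(HpOptimizer optimizer) {
        Pair<HpSet, Float> best = optimizer.getBest();
        HpSet bestConfig = best.getKey(); // the best-performing hyperparameter set
        float bestLoss = best.getValue(); // the validation loss recorded for it
        System.out.println("Best config: " + bestConfig + " (loss " + bestLoss + ")");
        return bestLoss;
    }
}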
-
-