OptParams is a Configuration-compatible case class that can be used to select optimization routines at runtime.
Configurations (a dispatch sketch follows the parameter descriptions below):
1) useStochastic=false, useL1=false: LBFGS with L2 regularization
2) useStochastic=false, useL1=true: OWLQN with L1 regularization
3) useStochastic=true, useL1=false: AdaptiveGradientDescent with L2 regularization
4) useStochastic=true, useL1=true: AdaptiveGradientDescent with L1 regularization
size of the minibatches to use when useStochastic is true and the objective is a BatchDiffFunction.
regularization constant to use.
step size (learning rate); applies only to SGD.
if true, use L1 regularization. Otherwise, use L2.
convergence tolerance, looking at both average improvement and the norm of the gradient.
if false, use LBFGS or OWLQN. If true, use some variant of Stochastic Gradient Descent.
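To make the four configurations concrete, here is a minimal, self-contained sketch. The case class below only mirrors the fields documented above; it is not the real breeze.optimize.OptParams, and the default values are illustrative assumptions.

{{{
// Illustrative stand-in for OptParams; mirrors the documented fields only.
// (Not the real breeze.optimize.OptParams; defaults are assumptions.)
case class OptParamsSketch(
  useStochastic: Boolean = false, // SGD variant vs. LBFGS/OWLQN
  useL1: Boolean = false,         // L1 vs. L2 regularization
  regularization: Double = 0.0,   // regularization constant
  batchSize: Int = 512,           // minibatch size for stochastic methods
  alpha: Double = 0.5,            // SGD step size
  tolerance: Double = 1e-5        // convergence tolerance
)

// The runtime dispatch implied by the configuration list above.
def optimizerFor(p: OptParamsSketch): String = (p.useStochastic, p.useL1) match {
  case (false, false) => "LBFGS with L2 regularization"
  case (false, true)  => "OWLQN with L1 regularization"
  case (true, false)  => "AdaptiveGradientDescent with L2 regularization"
  case (true, true)   => "AdaptiveGradientDescent with L1 regularization"
}
}}}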
Tracks information about the optimizer, including the current point, its value, its gradient, and any optimizer-specific history. Also includes information needed for checking convergence. A sketch of this record follows the field descriptions below.
the current point being considered
f(x)
f.gradientAt(x)
f(x) + r(x), where r is any regularization added to the objective. For LBFGS, this is f(x).
f'(x) + r'(x), where r is any regularization added to the objective. For LBFGS, this is f'(x).
the current iteration number.
f(x_0) + r(x_0), used for checking convergence
any information needed by the optimizer to do updates.
did the line search fail?
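The fields above correspond to a state record like the following sketch. This is an illustrative simplification assuming the field names documented above; the actual Breeze State signature may differ in details.

{{{
// Illustrative sketch of the optimizer state described above.
case class StateSketch[T, History](
  x: T,                          // the current point being considered
  value: Double,                 // f(x)
  grad: T,                       // f.gradientAt(x)
  adjustedValue: Double,         // f(x) + r(x); equals value for plain LBFGS
  adjustedGradient: T,           // f'(x) + r'(x); equals grad for plain LBFGS
  iter: Int,                     // the current iteration number
  initialAdjVal: Double,         // f(x_0) + r(x_0), used for convergence checks
  history: History,              // optimizer-specific update information
  searchFailed: Boolean = false  // whether the line search failed
)
}}}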
Runs the function and, if its value fails to decrease by at least improvementRequirement numFailures times in a row, aborts the optimization (see the sketch below).
how often, in iterations, to run the evaluation.
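As a sketch of how such a check might work (the names here are illustrative, not the actual Breeze API):

{{{
// Illustrative sketch of the abort rule described above: evaluate every
// evalFrequency iterations, and abort after numFailures consecutive
// evaluations that fail to improve by at least improvementRequirement.
class ImprovementMonitor(
    improvementRequirement: Double,
    numFailures: Int,
    evalFrequency: Int) {
  private var best = Double.PositiveInfinity
  private var failures = 0

  // Returns true when the optimization should abort.
  def update(iter: Int, value: Double): Boolean =
    if (iter % evalFrequency != 0) false
    else {
      if (best - value >= improvementRequirement) { best = value; failures = 0 }
      else failures += 1
      failures >= numFailures
    }
}
}}}

A caller would construct, say, new ImprovementMonitor(1e-4, 5, 10) and invoke update(iter, f(x)) each iteration, stopping the run once it returns true.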