breeze.optimize.FirstOrderMinimizer
OptParams is a Configuration-compatible case class that can be used to select optimization routines at runtime.

Configurations:
1) useStochastic=false, useL1=false: LBFGS with L2 regularization
2) useStochastic=false, useL1=true: OWLQN with L1 regularization
3) useStochastic=true, useL1=false: AdaptiveGradientDescent with L2 regularization
4) useStochastic=true, useL1=true: AdaptiveGradientDescent with L1 regularization
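The mapping above can be sketched as a plain pattern match over the two flags. This is an illustrative stand-in, not breeze's actual dispatch code, and the field defaults shown are assumptions for the sketch:

```scala
// Illustrative stand-in for breeze.optimize.FirstOrderMinimizer.OptParams;
// defaults here are assumptions, not necessarily breeze's own.
case class OptParams(
  batchSize: Int = 512,          // batch size for BatchDiffFunction + stochastic mode
  regularization: Double = 0.0,  // regularization constant
  alpha: Double = 0.5,           // SGD rate of change
  useL1: Boolean = false,
  tolerance: Double = 1e-5,
  useStochastic: Boolean = false)

// The four flag combinations select the four routines listed above.
def routineFor(p: OptParams): String = (p.useStochastic, p.useL1) match {
  case (false, false) => "LBFGS with L2 regularization"
  case (false, true)  => "OWLQN with L1 regularization"
  case (true, false)  => "AdaptiveGradientDescent with L2 regularization"
  case (true, true)   => "AdaptiveGradientDescent with L1 regularization"
}
```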
Parameters (names as in breeze's OptParams case class):

batchSize: size of batches to use if useStochastic and you give a BatchDiffFunction.
regularization: regularization constant to use.
alpha: rate of change to use; only applies to SGD.
useL1: if true, use L1 regularization; otherwise, use L2.
tolerance: convergence tolerance, looking at both average improvement and the norm of the gradient.
useStochastic: if false, use LBFGS or OWLQN; if true, use a variant of Stochastic Gradient Descent.
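A minimal usage sketch, assuming breeze is on the classpath and that OptParams exposes a minimize convenience method (present in the breeze versions I have seen; if yours differs, construct LBFGS/OWLQN directly):

```scala
import breeze.linalg.DenseVector
import breeze.optimize._
import breeze.optimize.FirstOrderMinimizer.OptParams

// Simple quadratic objective with its minimum at (3, 3, 3).
// DiffFunction.calculate returns the pair (value, gradient).
val f = new DiffFunction[DenseVector[Double]] {
  def calculate(x: DenseVector[Double]) = {
    val d = x - 3.0
    (d dot d, d * 2.0)
  }
}

// useStochastic = false, useL1 = false selects LBFGS (configuration 1 above).
val params = OptParams(useStochastic = false, useL1 = false, tolerance = 1e-6)

// Assumed convenience method; otherwise: new LBFGS[DenseVector[Double]](...).minimize(f, init)
val xmin = params.minimize(f, DenseVector.zeros[Double](3))
```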