Any history the derived minimization function needs to perform its updates, typically an approximation to the second derivative (the Hessian matrix).
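As an illustrative sketch (the type and field names here are assumptions, not the library's actual definitions), the history for a limited-memory method might hold the recent position deltas and gradient deltas that together implicitly approximate the inverse Hessian:

```scala
// Hypothetical sketch: history carried between iterations of a
// limited-memory quasi-Newton method. The last m position deltas (s_k)
// and gradient deltas (y_k) implicitly encode the Hessian approximation.
case class History(
  m: Int,                          // memory size: how many pairs to keep
  steps: List[Array[Double]],      // s_k = x_{k+1} - x_k, most recent first
  gradDeltas: List[Array[Double]]  // y_k = g_{k+1} - g_k, most recent first
) {
  // Record a new (s, y) pair, discarding the oldest beyond the memory limit.
  def updated(s: Array[Double], y: Array[Double]): History =
    copy(steps = (s :: steps).take(m), gradDeltas = (y :: gradDeltas).take(m))
}
```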
Given a direction, perform a line search to find a step size with which to descend. At the moment, this just executes backtracking, so the resulting step is not guaranteed to satisfy the Wolfe conditions.
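A minimal backtracking line search along these lines can be sketched as follows (the helper name, parameters, and defaults are illustrative assumptions, not the library's actual code). It shrinks the step until an Armijo sufficient-decrease condition holds:

```scala
// Sketch of backtracking line search with the Armijo condition.
// `f` is the objective, `grad` its gradient at x, `dir` a descent direction.
def backtrack(
  f: Array[Double] => Double,
  x: Array[Double],
  grad: Array[Double],
  dir: Array[Double],
  alpha0: Double = 1.0, // initial step size
  shrink: Double = 0.5, // backtracking factor
  c1: Double = 1e-4,    // sufficient-decrease constant
  maxIter: Int = 50
): Double = {
  val fx = f(x)
  // Directional derivative g^T d; negative for a descent direction.
  val slope = grad.zip(dir).map { case (g, d) => g * d }.sum
  def trial(a: Double) = x.zip(dir).map { case (xi, di) => xi + a * di }
  var alpha = alpha0
  var iter = 0
  // Shrink until f(x + alpha * dir) <= f(x) + c1 * alpha * g^T d.
  while (iter < maxIter && f(trial(alpha)) > fx + c1 * alpha * slope) {
    alpha *= shrink
    iter += 1
  }
  alpha
}
```

Note that Armijo alone checks only sufficient decrease; a Wolfe-condition search would also enforce a curvature condition on the new gradient.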
The current state
The objective function
The step direction
The size of the step to take
Port of LBFGS to Scala.
Special note for LBFGS: if you use it in published work, you must cite one of: * J. Nocedal. Updating Quasi-Newton Matrices with Limited Storage (1980), Mathematics of Computation 35, pp. 773-782. * D.C. Liu and J. Nocedal. On the Limited Memory BFGS Method for Large Scale Optimization (1989), Mathematical Programming B, 45, 3, pp. 503-528.
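The heart of L-BFGS is the two-loop recursion, which computes the search direction from the stored (s_i, y_i) pairs without ever forming the Hessian approximation explicitly. A self-contained sketch (function and parameter names are illustrative assumptions, not this port's actual API):

```scala
// Sketch of the L-BFGS two-loop recursion: computes an approximation to
// H_k * (-grad) from stored position deltas s_i and gradient deltas y_i.
def twoLoopDirection(
  grad: Array[Double],
  steps: List[Array[Double]],     // s_i, most recent first
  gradDeltas: List[Array[Double]] // y_i, most recent first
): Array[Double] = {
  def dot(a: Array[Double], b: Array[Double]) =
    a.zip(b).map { case (x, y) => x * y }.sum
  val q = grad.map(-_) // start from the negative gradient
  val pairs = steps.zip(gradDeltas)
  val rhos = pairs.map { case (s, y) => 1.0 / dot(s, y) }
  val alphas = new Array[Double](pairs.length)
  // First loop: newest to oldest, peel off the stored corrections.
  pairs.zipWithIndex.foreach { case ((s, y), i) =>
    val a = rhos(i) * dot(s, q)
    alphas(i) = a
    for (j <- q.indices) q(j) -= a * y(j)
  }
  // Scale by the usual initial Hessian guess gamma = (s.y) / (y.y).
  pairs.headOption.foreach { case (s, y) =>
    val gamma = dot(s, y) / dot(y, y)
    for (j <- q.indices) q(j) *= gamma
  }
  // Second loop: oldest to newest, reapply the corrections.
  pairs.zipWithIndex.reverse.foreach { case ((s, y), i) =>
    val beta = rhos(i) * dot(y, q)
    for (j <- q.indices) q(j) += (alphas(i) - beta) * s(j)
  }
  q
}
```

With an empty history this reduces to steepest descent (the direction is simply the negative gradient), which is why the first iteration of L-BFGS behaves like gradient descent.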