breeze.optimize

package optimize

Linear Supertypes: AnyRef, Any

Type Members

  1. class AdaDeltaGradientDescent[T] extends StochasticGradientDescent[T]

    Created by jda on 3/17/15.

  2. class ApproximateGradientFunction[K, T] extends DiffFunction[T]

    Approximates a gradient by finite differences.
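    A minimal sketch, assuming the constructor takes the raw function to differentiate (gradientAt is standard DiffFunction API, but treat the exact constructor shape as an assumption):

        import breeze.linalg._
        import breeze.optimize._

        // Wrap a plain objective so its gradient is estimated by finite differences.
        val plainF = (x: DenseVector[Double]) => math.pow(x(0) - 1.0, 2) + math.pow(x(1) + 2.0, 2)
        val approxF = new ApproximateGradientFunction[Int, DenseVector[Double]](plainF)
        val g = approxF.gradientAt(DenseVector(0.0, 0.0)) // approximately DenseVector(-2.0, 4.0)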

  3. trait ApproximateLineSearch extends MinimizingLineSearch

    A line search optimizes a function of one variable without analytic gradient information. It's often used approximately (e.g. in backtracking line search), where there is no intrinsic termination criterion, only extrinsic ones.

  4. class BacktrackingLineSearch extends ApproximateLineSearch

    Implements a backtracking line search like the one in LBFGS-C (which is (c) 2007-2010 Naoaki Okazaki, under BSD).

    The basic idea is to find an alpha at which f(alpha) is sufficiently smaller than f(0), possibly also requiring that the slope of f decrease by the right amount (the Wolfe conditions).

  5. trait BatchDiffFunction[T] extends DiffFunction[T] with (T, IndexedSeq[Int]) ⇒ Double

    A diff function that supports evaluation on subsets of the data. By default it evaluates on all the data.
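    A sketch of implementing one for a least-squares objective; the abstract members shown (calculate on a batch, fullRange) match my reading of the trait and should be treated as assumptions:

        import breeze.linalg._
        import breeze.optimize._

        val data = IndexedSeq(DenseVector(1.0, 2.0), DenseVector(3.0, 4.0))
        val targets = IndexedSeq(1.0, 2.0)

        val batchF = new BatchDiffFunction[DenseVector[Double]] {
          // the full index range; evaluating on it gives the exact objective
          def fullRange: IndexedSeq[Int] = data.indices

          // value and gradient restricted to the given batch of examples
          def calculate(w: DenseVector[Double], batch: IndexedSeq[Int]): (Double, DenseVector[Double]) = {
            var loss = 0.0
            val grad = DenseVector.zeros[Double](w.length)
            for (i <- batch) {
              val err = (data(i) dot w) - targets(i)
              loss += 0.5 * err * err
              grad += data(i) * err
            }
            (loss, grad)
          }
        }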

  6. case class BatchSize(size: Int) extends OptimizationOption with Product with Serializable

  7. class CachedBatchDiffFunction[T] extends BatchDiffFunction[T]

  8. class CachedDiffFunction[T] extends DiffFunction[T]

  9. class CompactHessian extends NumericOps[CompactHessian]

  10. abstract class CubicLineSearch extends SerializableLogging with MinimizingLineSearch

  11. trait DiffFunction[T] extends StochasticDiffFunction[T] with NumericOps[DiffFunction[T]]

    Represents a differentiable function whose output is guaranteed to be consistent
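    A minimal example in the style of the Breeze quickstart; calculate returns the value and the gradient together:

        import breeze.linalg._
        import breeze.optimize._

        // f(x) = ||x - 3||^2, with gradient 2 * (x - 3)
        val f = new DiffFunction[DenseVector[Double]] {
          def calculate(x: DenseVector[Double]): (Double, DenseVector[Double]) = {
            val diff = x - 3.0
            (diff dot diff, diff * 2.0)
          }
        }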

  12. sealed trait DiffFunctionOpImplicits extends AnyRef

  13. class EmpiricalHessian[T] extends AnyRef

    The empirical Hessian evaluates the derivative for multiplication:

        H d = \lim_{\epsilon \to 0} (f'(x + \epsilon d) - f'(x)) / \epsilon
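    A sketch of the idea itself (not this class's API), for a standalone gradient function:

        import breeze.linalg._

        // Approximate the Hessian-vector product H * d at x by a forward
        // difference of the gradient, per the limit above.
        def hessianTimes(grad: DenseVector[Double] => DenseVector[Double],
                         x: DenseVector[Double],
                         d: DenseVector[Double],
                         eps: Double = 1e-5): DenseVector[Double] =
          (grad(x + (d * eps)) - grad(x)) / eps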

  14. sealed class FirstOrderException extends RuntimeException

  15. abstract class FirstOrderMinimizer[T, DF <: StochasticDiffFunction[T]] extends Minimizer[T, DF] with SerializableLogging

  16. class FisherDiffFunction[T] extends SecondOrderFunction[T, FisherMatrix[T]]

  17. class FisherMatrix[T] extends AnyRef

    The Fisher matrix approximates the Hessian by E[grad grad']. We further approximate this expectation by Monte Carlo sampling.

  18. trait IterableOptimizationPackage[Function, Vector, State] extends OptimizationPackage[Function, Vector]

  19. case class L1Regularization(value: Double = 1.0) extends OptimizationOption with Product with Serializable

  20. case class L2Regularization(value: Double = 1.0) extends OptimizationOption with Product with Serializable

  21. class LBFGS[T] extends FirstOrderMinimizer[T, DiffFunction[T]] with SerializableLogging

    Port of LBFGS to Scala.

    Special note for LBFGS: if you use it in published work, you must cite one of:

      * J. Nocedal. Updating Quasi-Newton Matrices with Limited Storage (1980), Mathematics of Computation 35, pp. 773-782.
      * D. C. Liu and J. Nocedal. On the Limited Memory BFGS Method for Large Scale Optimization (1989), Mathematical Programming B, 45, 3, pp. 503-528.
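    Typical usage, reusing the f defined under DiffFunction above (m is the memory size; the argument values here are illustrative):

        import breeze.linalg._
        import breeze.optimize._

        val lbfgs = new LBFGS[DenseVector[Double]](maxIter = 100, m = 3)
        val optimum = lbfgs.minimize(f, DenseVector.zeros[Double](5)) // converges to all 3s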

  22. class LBFGSB extends FirstOrderMinimizer[DenseVector[Double], DiffFunction[DenseVector[Double]]] with SerializableLogging

    This algorithm follows the paper "A Limited Memory Algorithm for Bound Constrained Optimization" by Richard H. Byrd, Peihuang Lu, Jorge Nocedal, and Ciyou Zhu. Created by fanming.chen on 2015/3/7.

    If the StrongWolfeLineSearch(maxZoomIter, maxLineSearchIter) parameters are too small, wolfeRuleSearch.minimize may throw a FirstOrderException; increase the two values appropriately if that happens.

  23. trait LineSearch extends ApproximateLineSearch

    A line search optimizes a function of one variable without analytic gradient information. It differs from an approximate line search only in that it tries to find an exact minimizer.

  24. class LineSearchFailed extends FirstOrderException

  25. case class MaxIterations(num: Int) extends OptimizationOption with Product with Serializable

  26. trait Minimizer[T, -F] extends AnyRef

    Anything that can minimize a function.

  27. trait MinimizingLineSearch extends AnyRef

  28. class NaNHistory extends FirstOrderException

  29. class OWLQN[K, T] extends LBFGS[T] with SerializableLogging

    Implements the Orthant-Wise Limited-memory Quasi-Newton method, a variant of LBFGS that handles L1 regularization.

    See Andrew and Gao (2007), "Scalable Training of L1-Regularized Log-Linear Models".
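    A sketch, reusing f from DiffFunction above; the (maxIter, memory, L1 penalty) constructor shape is an assumption:

        import breeze.linalg._
        import breeze.optimize._

        // maxIter = 100, memory = 4, L1 penalty weight = 1.0 (assumed argument order)
        val owlqn = new OWLQN[Int, DenseVector[Double]](100, 4, 1.0)
        val sparseOptimum = owlqn.minimize(f, DenseVector.zeros[Double](5))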

  30. sealed trait OptimizationOption extends (OptParams) ⇒ OptParams

  31. trait OptimizationPackage[Function, Vector] extends AnyRef

  32. sealed trait OptimizationPackageLowPriority extends OptimizationPackageLowPriority2

  33. sealed trait OptimizationPackageLowPriority2 extends AnyRef

  34. class ProjectedQuasiNewton extends FirstOrderMinimizer[DenseVector[Double], DiffFunction[DenseVector[Double]]] with Projecting[DenseVector[Double]] with SerializableLogging

  35. trait Projecting[T] extends AnyRef

  36. trait SecondOrderFunction[T, H] extends DiffFunction[T]

    Represents a function for which we can easily compute the Hessian.

    For conjugate gradient methods, you can play tricks with the Hessian, returning an object that only supports multiplication.

  37. class SpectralProjectedGradient[T] extends FirstOrderMinimizer[T, DiffFunction[T]] with Projecting[T] with SerializableLogging

    SPG is a Spectral Projected Gradient minimizer; it minimizes a differentiable function subject to the optimum lying in some set, given by the projection operator projection.

    T: the vector type
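    A hypothetical sketch, assuming the projection operator is supplied to the constructor as a T => T function (the parameter name is a guess):

        import breeze.linalg._
        import breeze.optimize._

        // Project onto the nonnegative orthant, then minimize f subject to x >= 0.
        val projectNonneg = (x: DenseVector[Double]) => x.map(math.max(_, 0.0))
        val spg = new SpectralProjectedGradient[DenseVector[Double]](projection = projectNonneg)
        val constrained = spg.minimize(f, DenseVector.zeros[Double](5))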

  38. class StepSizeOverflow extends FirstOrderException

  39. case class StepSizeScale(alpha: Double = 1.0) extends OptimizationOption with Product with Serializable

  40. class StepSizeUnderflow extends FirstOrderException

  41. class StochasticAveragedGradient[T] extends FirstOrderMinimizer[T, BatchDiffFunction[T]]

  42. trait StochasticDiffFunction[T] extends (T) ⇒ Double with NumericOps[StochasticDiffFunction[T]]

    A differentiable function whose output is not guaranteed to be the same across consecutive invocations.

  43. abstract class StochasticGradientDescent[T] extends FirstOrderMinimizer[T, StochasticDiffFunction[T]] with SerializableLogging

    Minimizes a function using stochastic gradient descent.
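    A sketch, reusing batchF from BatchDiffFunction above; both the companion factory and withRandomBatches reflect my reading of the API, so treat them as assumptions:

        import breeze.linalg._
        import breeze.optimize._

        val sgd = StochasticGradientDescent[DenseVector[Double]](initialStepSize = 1.0, maxIter = 200)
        // withRandomBatches (assumed API) draws random mini-batches of 1 example here
        val w = sgd.minimize(batchF.withRandomBatches(1), DenseVector.zeros[Double](2))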

  44. class StrongWolfeLineSearch extends CubicLineSearch

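    A sketch of driving a line search directly, using the StrongWolfeLineSearch(maxZoomIter, maxLineSearchIter) constructor mentioned in the LBFGSB notes above; functionFromSearchDirection is my reading of the LineSearch object's helper, so treat it as an assumption:

        import breeze.linalg._
        import breeze.optimize._

        val search = new StrongWolfeLineSearch(maxZoomIter = 10, maxLineSearchIter = 10)

        // Restrict f to the ray x + alpha * dir, then find a Wolfe-acceptable step.
        val x = DenseVector.zeros[Double](5)
        val dir = DenseVector.fill(5)(1.0) // a descent direction for f at x
        val f1d = LineSearch.functionFromSearchDirection(f, x, dir)
        val alpha = search.minimize(f1d, 1.0)
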
  45. case class Tolerance(fvalTolerance: Double = 1E-5, gvalTolerance: Double = 1e-6) extends OptimizationOption with Product with Serializable

  46. class TruncatedNewtonMinimizer[T, H] extends Minimizer[T, SecondOrderFunction[T, H]] with SerializableLogging

    Implements a truncated Newton trust region method (like TRON). Also implements "Hessian-free learning". We have a few extra tricks though... :)

Value Members

  1. object AdaptiveGradientDescent

    Implements the L2^2 and L1 updates from Duchi et al. (2010), "Adaptive Subgradient Methods for Online Learning and Stochastic Optimization".

    Basically, we use "forward regularization" and an adaptive step size based on the previous gradients.

  2. object BatchDiffFunction

  3. object DiffFunction extends DiffFunctionOpImplicits

  4. object EmpiricalHessian

  5. object FirstOrderMinimizer extends Serializable

  6. object FisherMatrix

  7. object GradientTester extends SerializableLogging

    Compares a computed gradient with an empirical gradient based on finite differences. Essential for debugging dynamic programs.
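    A sketch, assuming a test(f, x) entry point with the index type as the first type parameter:

        import breeze.linalg._
        import breeze.optimize._

        // Flags coordinates where the analytic gradient of f disagrees with
        // a finite-difference estimate at a random test point.
        val testPoint = DenseVector.rand(5)
        GradientTester.test[Int, DenseVector[Double]](f, testPoint)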

  8. object LBFGS extends Serializable

  9. object LBFGSB extends Serializable

  10. object LineSearch

  11. object OptimizationOption

  12. object OptimizationPackage extends OptimizationPackageLowPriority

  13. object PreferBatch extends OptimizationOption with Product with Serializable

  14. object PreferOnline extends OptimizationOption with Product with Serializable

  15. object ProjectedQuasiNewton extends SerializableLogging

  16. object RootFinding

    Root finding algorithms.
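    A hypothetical sketch, assuming a find(fn, x0) entry point:

        import breeze.optimize.RootFinding

        // Find a root of x^2 - 4 from an initial guess (expect x = 2 or x = -2).
        val root = RootFinding.find(x => x * x - 4.0, x0 = 1.0)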

  17. object SecondOrderFunction

  18. object StochasticGradientDescent extends Serializable

  19. package flow

  20. def iterations[Objective, Vector, State](fn: Objective, init: Vector, options: OptimizationOption*)(implicit optimization: IterableOptimizationPackage[Objective, Vector, State]): Iterator[State]

    Returns a sequence of states representing the iterates of a solver, given a breeze.optimize.IterableOptimizationPackage that knows how to minimize. The actual state class varies with the kind of function passed in. Typically, a state has a .x value of type Vector, the current point being evaluated, and a .value, the current objective value.
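    For example, tracing the objective value at each iterate (state fields as described above), reusing f from DiffFunction:

        import breeze.linalg._
        import breeze.optimize._

        val states = iterations(f, DenseVector.zeros[Double](5), MaxIterations(50))
        states.foreach(s => println(s.value)) // objective value per iteration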

  21. package linear

  22. def minimize[Objective, Vector](fn: Objective, init: Vector, options: OptimizationOption*)(implicit optimization: OptimizationPackage[Objective, Vector]): Vector

    Minimizes a function, given a breeze.optimize.OptimizationPackage that knows how to minimize it.
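    For example, with the OptimizationOption values listed above, reusing f from DiffFunction:

        import breeze.linalg._
        import breeze.optimize._

        val result = minimize(f, DenseVector.zeros[Double](5),
                              MaxIterations(100), L2Regularization(1.0))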

  23. package proximal

