package optimize

Package Members

  1. package flow
  2. package linear
  3. package proximal

Type Members

  1. class AdaDeltaGradientDescent[T] extends StochasticGradientDescent[T]

    Created by jda on 3/17/15.

  2. class ApproximateGradientFunction[K, T] extends DiffFunction[T]

    Approximates a gradient by finite differences.
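
    A sketch of wrapping a plain objective (with no analytic gradient) to obtain finite-difference gradients; the objective and names here are illustrative, assuming the constructor takes the function and an optional epsilon:

      import breeze.linalg.DenseVector
      import breeze.optimize.ApproximateGradientFunction

      // plain objective with no analytic gradient
      val obj: DenseVector[Double] => Double =
        x => math.pow(x(0) - 1.0, 2) + math.pow(x(1) + 2.0, 2)

      // key type K is Int for DenseVector indices
      val approx = new ApproximateGradientFunction[Int, DenseVector[Double]](obj)
      val (value, grad) = approx.calculate(DenseVector(0.0, 0.0))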

  3. trait ApproximateLineSearch extends MinimizingLineSearch

    A line search optimizes a function of one variable without analytic gradient information. It's often used approximately (e.g. in backtracking line search), where there is no intrinsic termination criterion, only an extrinsic one.

  4. class BacktrackingLineSearch extends ApproximateLineSearch

    Implements a backtracking line search like that in LBFGS-C (which is (c) 2007-2010 Naoaki Okazaki, under the BSD license).

    The basic idea is to find a step size alpha for which f(alpha) is sufficiently smaller than f(0), possibly also requiring that the slope of f decrease by the right amount (the Wolfe conditions).

  5. trait BatchDiffFunction[T] extends DiffFunction[T] with (T, IndexedSeq[Int]) => Double

    A diff function that supports evaluating on subsets of the data. By default it evaluates on all the data.
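
    A sketch of a batch objective over a toy dataset; the dataset, names, and least-squares loss are illustrative:

      import breeze.linalg.DenseVector
      import breeze.optimize.BatchDiffFunction

      val data = IndexedSeq(
        (DenseVector(1.0, 0.0), 1.0),
        (DenseVector(0.0, 1.0), -1.0)
      )

      val objective = new BatchDiffFunction[DenseVector[Double]] {
        // indices of the full dataset
        def fullRange: IndexedSeq[Int] = data.indices

        // squared-error loss and gradient over the requested subset of examples
        def calculate(w: DenseVector[Double], batch: IndexedSeq[Int]): (Double, DenseVector[Double]) = {
          var loss = 0.0
          val grad = DenseVector.zeros[Double](w.length)
          for (i <- batch) {
            val (x, y) = data(i)
            val err = (w dot x) - y
            loss += 0.5 * err * err
            grad += x * err
          }
          (loss, grad)
        }
      }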

  6. case class BatchSize(size: Int) extends OptimizationOption with Product with Serializable
  7. class CachedBatchDiffFunction[T] extends BatchDiffFunction[T]

  8. class CachedDiffFunction[T] extends DiffFunction[T]

  9. class CompactHessian extends NumericOps[CompactHessian]
  10. abstract class CubicLineSearch extends SerializableLogging with MinimizingLineSearch
  11. trait DiffFunction[T] extends StochasticDiffFunction[T] with NumericOps[DiffFunction[T]]

    Represents a differentiable function whose output is guaranteed to be consistent
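
    For example, a quadratic objective with its analytic gradient might be written as follows (a minimal sketch; the target value 3.0 and the names are illustrative):

      import breeze.linalg.DenseVector
      import breeze.optimize.DiffFunction

      // f(x) = ||x - 3||^2 with gradient 2 * (x - 3)
      val quadratic = new DiffFunction[DenseVector[Double]] {
        def calculate(x: DenseVector[Double]): (Double, DenseVector[Double]) = {
          val d = x - 3.0
          (d dot d, d * 2.0)
        }
      }

      val (value, gradient) = quadratic.calculate(DenseVector(0.0, 0.0))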

  12. sealed trait DiffFunctionOpImplicits extends AnyRef
  13. class EmpiricalHessian[T] extends AnyRef

    The empirical Hessian evaluates the derivative for multiplication:

    H * d = lim_{e -> 0} (f'(x + e * d) - f'(x)) / e
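
    A sketch of that finite-difference Hessian-vector product using any DiffFunction; the helper name hessianVectorProduct is hypothetical, not part of this class's API:

      import breeze.linalg.DenseVector
      import breeze.optimize.DiffFunction

      // H * d ~= (f'(x + e * d) - f'(x)) / e for a small e
      def hessianVectorProduct(df: DiffFunction[DenseVector[Double]],
                               x: DenseVector[Double],
                               d: DenseVector[Double],
                               e: Double = 1e-5): DenseVector[Double] =
        (df.gradientAt(x + (d * e)) - df.gradientAt(x)) / e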

  14. sealed class FirstOrderException extends RuntimeException
  15. abstract class FirstOrderMinimizer[T, DF <: StochasticDiffFunction[T]] extends Minimizer[T, DF] with SerializableLogging

  16. class FisherDiffFunction[T] extends SecondOrderFunction[T, FisherMatrix[T]]
  17. class FisherMatrix[T] extends AnyRef

    The Fisher matrix approximates the Hessian by E[grad grad']. We further approximate this with a Monte Carlo approximation to the expectation.

  18. trait IterableOptimizationPackage[Function, Vector, State] extends OptimizationPackage[Function, Vector]
  19. case class L1Regularization(value: Double = 1.0) extends OptimizationOption with Product with Serializable
  20. case class L2Regularization(value: Double = 1.0) extends OptimizationOption with Product with Serializable
  21. class LBFGS[T] extends FirstOrderMinimizer[T, DiffFunction[T]] with SerializableLogging

    Port of LBFGS to Scala.

    Special note for LBFGS: if you use it in published work, you must cite one of:

      * J. Nocedal. Updating Quasi-Newton Matrices with Limited Storage (1980), Mathematics of Computation 35, pp. 773-782.
      * D. C. Liu and J. Nocedal. On the Limited Memory BFGS Method for Large Scale Optimization (1989), Mathematical Programming B, 45, 3, pp. 503-528.
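
    A typical invocation looks like the following sketch; the objective and parameter values are illustrative:

      import breeze.linalg.DenseVector
      import breeze.optimize.{DiffFunction, LBFGS}

      val f = new DiffFunction[DenseVector[Double]] {
        def calculate(x: DenseVector[Double]) = { val d = x - 3.0; (d dot d, d * 2.0) }
      }

      // m is the number of corrections kept for the inverse-Hessian approximation
      val lbfgs = new LBFGS[DenseVector[Double]](maxIter = 100, m = 7)
      val xOpt = lbfgs.minimize(f, DenseVector.zeros[Double](5))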

  22. class LBFGSB extends FirstOrderMinimizer[DenseVector[Double], DiffFunction[DenseVector[Double]]] with SerializableLogging

    This algorithm follows the paper "A Limited Memory Algorithm for Bound Constrained Optimization" by Richard H. Byrd, Peihuang Lu, Jorge Nocedal, and Ciyou Zhu. Created by fanming.chen on 2015/3/7.

    If the StrongWolfeLineSearch(maxZoomIter, maxLineSearchIter) parameters are too small, wolfeRuleSearch.minimize may throw a FirstOrderException; increase those two values to something appropriate.
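
    A usage sketch, assuming the constructor takes the lower and upper bound vectors first; the box bounds and objective are illustrative:

      import breeze.linalg.DenseVector
      import breeze.optimize.{DiffFunction, LBFGSB}

      val f = new DiffFunction[DenseVector[Double]] {
        def calculate(x: DenseVector[Double]) = { val d = x - 3.0; (d dot d, d * 2.0) }
      }

      // constrain the solution to the box [0, 2] x [0, 2]
      val solver = new LBFGSB(DenseVector(0.0, 0.0), DenseVector(2.0, 2.0))
      val xOpt = solver.minimize(f, DenseVector(1.0, 1.0))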

  23. trait LineSearch extends ApproximateLineSearch

    A line search optimizes a function of one variable without analytic gradient information. It differs from ApproximateLineSearch only in whether or not it tries to find an exact minimizer.

  24. class LineSearchFailed extends FirstOrderException
  25. case class MaxIterations(num: Int) extends OptimizationOption with Product with Serializable
  26. trait Minimizer[T, -F] extends AnyRef

    Anything that can minimize a function

  27. trait MinimizingLineSearch extends AnyRef
  28. class NaNHistory extends FirstOrderException
  29. class OWLQN[K, T] extends LBFGS[T] with SerializableLogging

    Implements the Orthant-Wise Limited-memory Quasi-Newton (OWL-QN) method, a variant of LBFGS that handles L1 regularization.

    Paper: Andrew and Gao (2007), Scalable Training of L1-Regularized Log-Linear Models.
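
    A usage sketch, assuming the (maxIter, m, l1reg) auxiliary constructor with a scalar L1 penalty; the smooth objective is illustrative:

      import breeze.linalg.DenseVector
      import breeze.optimize.{DiffFunction, OWLQN}

      val f = new DiffFunction[DenseVector[Double]] {
        def calculate(x: DenseVector[Double]) = { val d = x - 3.0; (d dot d, d * 2.0) }
      }

      // 100 iterations, 7 corrections, L1 penalty of 0.5 on every coordinate
      val owlqn = new OWLQN[Int, DenseVector[Double]](100, 7, 0.5)
      val sparseOpt = owlqn.minimize(f, DenseVector.zeros[Double](5))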

  30. sealed trait OptimizationOption extends (OptParams) => OptParams

  31. trait OptimizationPackage[Function, Vector] extends AnyRef

  32. sealed trait OptimizationPackageLowPriority extends OptimizationPackageLowPriority2
  33. sealed trait OptimizationPackageLowPriority2 extends AnyRef
  34. class ProjectedQuasiNewton extends FirstOrderMinimizer[DenseVector[Double], DiffFunction[DenseVector[Double]]] with Projecting[DenseVector[Double]] with SerializableLogging
  35. trait Projecting[T] extends AnyRef
  36. trait SecondOrderFunction[T, H] extends DiffFunction[T]

    Represents a function for which we can easily compute the Hessian.

    For conjugate gradient methods, you can play tricks with the hessian, returning an object that only supports multiplication.

  37. class SpectralProjectedGradient[T] extends FirstOrderMinimizer[T, DiffFunction[T]] with Projecting[T] with SerializableLogging

    SPG is a Spectral Projected Gradient minimizer; it minimizes a differentiable function subject to the optimum being in some set, given by the projection operator projection.

    T: the vector type

  38. class StepSizeOverflow extends FirstOrderException
  39. case class StepSizeScale(alpha: Double = 1.0) extends OptimizationOption with Product with Serializable
  40. class StepSizeUnderflow extends FirstOrderException
  41. class StochasticAveragedGradient[T] extends FirstOrderMinimizer[T, BatchDiffFunction[T]]

  42. trait StochasticDiffFunction[T] extends (T) => Double with NumericOps[StochasticDiffFunction[T]]

    A differentiable function whose output is not guaranteed to be the same across consecutive invocations.

  43. abstract class StochasticGradientDescent[T] extends FirstOrderMinimizer[T, StochasticDiffFunction[T]] with SerializableLogging

    Minimizes a function using stochastic gradient descent

  44. class StrongWolfeLineSearch extends CubicLineSearch
  45. case class Tolerance(fvalTolerance: Double = 1E-5, gvalTolerance: Double = 1e-6) extends OptimizationOption with Product with Serializable
  46. class TruncatedNewtonMinimizer[T, H] extends Minimizer[T, SecondOrderFunction[T, H]] with SerializableLogging

    Implements a truncated Newton trust-region method (like TRON). Also implements "Hessian-free learning". We have a few extra tricks though... :)

Value Members

  1. def iterations[Objective, Vector, State](fn: Objective, init: Vector, options: OptimizationOption*)(implicit optimization: IterableOptimizationPackage[Objective, Vector, State]): Iterator[State]

    Returns a sequence of states representing the iterates of a solver, given a breeze.optimize.IterableOptimizationPackage that knows how to minimize. The actual state class varies with the kind of function passed in. Typically, states have an .x of type Vector that is the current point being evaluated, and a .value that is the current objective value.
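
    A sketch of inspecting the iterates, assuming the resolved state type exposes .value as noted above; the objective is illustrative:

      import breeze.linalg.DenseVector
      import breeze.optimize._

      val f = new DiffFunction[DenseVector[Double]] {
        def calculate(x: DenseVector[Double]) = { val d = x - 3.0; (d dot d, d * 2.0) }
      }

      // print the objective value at every iterate
      iterations(f, DenseVector.zeros[Double](2), MaxIterations(50)).foreach { state =>
        println(state.value)
      }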

  2. def minimize[Objective, Vector](fn: Objective, init: Vector, options: OptimizationOption*)(implicit optimization: OptimizationPackage[Objective, Vector]): Vector

    Minimizes a function, given a breeze.optimize.OptimizationPackage that knows how to minimize it.
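
    A minimal usage sketch; the quadratic objective and option values are illustrative:

      import breeze.linalg.DenseVector
      import breeze.optimize._

      val quadratic = new DiffFunction[DenseVector[Double]] {
        def calculate(x: DenseVector[Double]) = { val d = x - 3.0; (d dot d, d * 2.0) }
      }

      // options such as MaxIterations and L2Regularization are applied before solving
      val xOpt = minimize(quadratic, DenseVector.zeros[Double](2),
        MaxIterations(100), L2Regularization(1.0))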

  3. object AdaptiveGradientDescent

    Implements the L2^2 and L1 updates from Duchi et al 2010 Adaptive Subgradient Methods for Online Learning and Stochastic Optimization.

    Basically, we use "forward regularization" and an adaptive step size based on the previous gradients.

  4. object BatchDiffFunction
  5. object DiffFunction extends DiffFunctionOpImplicits
  6. object EmpiricalHessian
  7. object FirstOrderMinimizer extends Serializable
  8. object FisherMatrix
  9. object GradientTester extends SerializableLogging

    Class that compares the computed gradient with an empirical gradient based on finite differences. Essential for debugging dynamic programs.
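
    A usage sketch, assuming GradientTester.test takes the function and a point and returns per-coordinate differences between the analytic and empirical gradients; the objective and test point are illustrative:

      import breeze.linalg.DenseVector
      import breeze.optimize.{DiffFunction, GradientTester}

      val f = new DiffFunction[DenseVector[Double]] {
        def calculate(x: DenseVector[Double]) = { val d = x - 3.0; (d dot d, d * 2.0) }
      }

      // differences between the analytic and finite-difference gradients
      val diffs = GradientTester.test[Int, DenseVector[Double]](f, DenseVector(0.25, -1.0, 3.0))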

  10. object LBFGS extends Serializable
  11. object LBFGSB extends Serializable
  12. object LineSearch
  13. object OptimizationOption
  14. object OptimizationPackage extends OptimizationPackageLowPriority
  15. case object PreferBatch extends OptimizationOption with Product with Serializable
  16. case object PreferOnline extends OptimizationOption with Product with Serializable
  17. object ProjectedQuasiNewton extends SerializableLogging
  18. object RootFinding

    Root finding algorithms

  19. object SecondOrderFunction
  20. object StochasticGradientDescent extends Serializable
