breeze.optimize

FirstOrderMinimizer

object FirstOrderMinimizer extends Serializable

Linear Supertypes
Serializable, Serializable, AnyRef, Any

Type Members

  1. trait ConvergenceCheck[T] extends AnyRef
  2. trait ConvergenceReason extends AnyRef
  3. case class FunctionValuesConverged[T](tolerance: Double, relative: Boolean, historyLength: Int) extends ConvergenceCheck[T] with Product with Serializable
  4. case class MonitorFunctionValuesCheck[T](f: (T) ⇒ Double, numFailures: Int, improvementRequirement: Double, evalFrequency: Int) extends ConvergenceCheck[T] with SerializableLogging with Product with Serializable
  5. case class OptParams(batchSize: Int = 512, regularization: Double = 0.0, alpha: Double = 0.5, maxIterations: Int = 1000, useL1: Boolean = false, tolerance: Double = 1E-5, useStochastic: Boolean = false, randomSeed: Int = 0) extends Product with Serializable

    OptParams is a Configuration-compatible case class that can be used to select optimization routines at runtime.

    Configurations:
      1. useStochastic=false, useL1=false: LBFGS with L2 regularization
      2. useStochastic=false, useL1=true: OWLQN with L1 regularization
      3. useStochastic=true, useL1=false: AdaptiveGradientDescent with L2 regularization
      4. useStochastic=true, useL1=true: AdaptiveGradientDescent with L1 regularization

    batchSize

    size of batches to use when useStochastic is true and the objective is a BatchDiffFunction

    regularization

    regularization constant to use.

    alpha

    rate of change (step size) to use; applies only to SGD.

    useL1

    if true, use L1 regularization; otherwise, use L2.

    tolerance

    convergence tolerance, looking at both average improvement and the norm of the gradient.

    useStochastic

    if false, use LBFGS or OWLQN; if true, use a variant of Stochastic Gradient Descent.
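    The configuration table above can be exercised directly from code. A minimal sketch, assuming Breeze is on the classpath and that OptParams.minimize accepts a DiffFunction (as in recent Breeze releases); the toy quadratic objective is purely illustrative:

    ```scala
    import breeze.linalg.DenseVector
    import breeze.optimize.{DiffFunction, FirstOrderMinimizer}

    // Toy objective: f(x) = ||x - 3||^2, minimized at x = (3, ..., 3).
    val f = new DiffFunction[DenseVector[Double]] {
      def calculate(x: DenseVector[Double]): (Double, DenseVector[Double]) = {
        val diff = x - 3.0
        (diff.dot(diff), diff * 2.0)
      }
    }

    // useStochastic = false and useL1 = false selects LBFGS with L2 regularization.
    val params = FirstOrderMinimizer.OptParams(
      regularization = 0.0,
      maxIterations  = 100,
      useL1          = false,
      useStochastic  = false)

    val xOpt = params.minimize(f, DenseVector.zeros[Double](5))
    // xOpt should be close to DenseVector(3.0, 3.0, 3.0, 3.0, 3.0)
    ```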

  6. case class SequenceConvergenceCheck[T](checks: IndexedSeq[ConvergenceCheck[T]]) extends ConvergenceCheck[T] with Product with Serializable
  7. case class State[+T, +ConvergenceInfo, +History](x: T, value: Double, grad: T, adjustedValue: Double, adjustedGradient: T, iter: Int, initialAdjVal: Double, history: History, convergenceInfo: ConvergenceInfo, searchFailed: Boolean = false) extends Product with Serializable

    Tracks the information about the optimizer, including the current point, its value, gradient, and then any history. Also includes information for checking convergence.

    x

    the current point being considered

    value

    f(x)

    grad

    f.gradientAt(x)

    adjustedValue

    f(x) + r(x), where r is any regularization added to the objective. For LBFGS, this is f(x).

    adjustedGradient

    f'(x) + r'(x), where r is any regularization added to the objective. For LBFGS, this is f'(x).

    iter

    the current iteration number.

    initialAdjVal

    f(x_0) + r(x_0), used for checking convergence

    history

    any information needed by the optimizer to do updates.

    searchFailed

    did the line search fail?
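    To see these fields in practice, one can step through an optimizer's iterations stream; each element is a FirstOrderMinimizer.State. A sketch, assuming Breeze is on the classpath (LBFGS is one concrete FirstOrderMinimizer):

    ```scala
    import breeze.linalg.DenseVector
    import breeze.optimize.{DiffFunction, LBFGS}

    // Simple objective: f(x) = x . x, with gradient 2x.
    val f = new DiffFunction[DenseVector[Double]] {
      def calculate(x: DenseVector[Double]): (Double, DenseVector[Double]) =
        (x.dot(x), x * 2.0)
    }

    val lbfgs = new LBFGS[DenseVector[Double]](maxIter = 50, m = 5)

    // Each State exposes x (the current point), value (f(x)), grad, and iter.
    // For LBFGS no regularization is added, so adjustedValue == value.
    for (state <- lbfgs.iterations(f, DenseVector(1.0, 2.0, 3.0))) {
      println(s"iter=${state.iter} value=${state.value}")
    }
    ```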

Value Members

  1. final def !=(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  2. final def ##(): Int
     Definition Classes: AnyRef → Any
  3. final def ==(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  4. object ConvergenceCheck
  5. object FunctionValuesConverged extends ConvergenceReason with Product with Serializable
  6. object GradientConverged extends ConvergenceReason with Product with Serializable
  7. object MaxIterations extends ConvergenceReason with Product with Serializable
  8. object MonitorFunctionNotImproving extends ConvergenceReason with Product with Serializable
  9. object ProjectedStepConverged extends ConvergenceReason with Product with Serializable
  10. object SearchFailed extends ConvergenceReason with Product with Serializable
  11. final def asInstanceOf[T0]: T0
      Definition Classes: Any
  12. def clone(): AnyRef
      Attributes: protected[java.lang]
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  13. def defaultConvergenceCheck[T](maxIter: Int, tolerance: Double, relative: Boolean = true, fvalMemory: Int = 20)(implicit space: NormedModule[T, Double]): ConvergenceCheck[T]
  14. final def eq(arg0: AnyRef): Boolean
      Definition Classes: AnyRef
  15. def equals(arg0: Any): Boolean
      Definition Classes: AnyRef → Any
  16. def finalize(): Unit
      Attributes: protected[java.lang]
      Definition Classes: AnyRef
      Annotations: @throws( classOf[java.lang.Throwable] )
  17. def functionValuesConverged[T](tolerance: Double = 1E-9, relative: Boolean = true, historyLength: Int = 10): ConvergenceCheck[T]
  18. final def getClass(): Class[_]
      Definition Classes: AnyRef → Any
  19. def gradientConverged[T](tolerance: Double, relative: Boolean = true)(implicit space: NormedModule[T, Double]): ConvergenceCheck[T]
  20. def hashCode(): Int
      Definition Classes: AnyRef → Any
  21. final def isInstanceOf[T0]: Boolean
      Definition Classes: Any
  22. def maxIterationsReached[T](maxIter: Int): ConvergenceCheck[T]
  23. def monitorFunctionValues[T](f: (T) ⇒ Double, numFailures: Int = 5, improvementRequirement: Double = 1E-2, evalFrequency: Int = 10): ConvergenceCheck[T]

      Runs the function and, if it fails to decrease by at least improvementRequirement numFailures times in a row, aborts the optimization.

      evalFrequency

      how often we run the evaluation
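    The convergence-check factories above compose into a single check. A sketch, assuming Breeze is on the classpath and that ConvergenceCheck supplies a || combinator that wraps sub-checks in a SequenceConvergenceCheck; if not, SequenceConvergenceCheck(IndexedSeq(...)) can be constructed directly:

    ```scala
    import breeze.linalg.DenseVector
    import breeze.optimize.FirstOrderMinimizer._

    type V = DenseVector[Double]

    // Stop as soon as any sub-check fires: function values stabilize, the
    // gradient norm falls below tolerance, or the iteration cap is hit.
    val check: ConvergenceCheck[V] =
      functionValuesConverged[V](tolerance = 1e-9) ||
        gradientConverged[V](tolerance = 1e-6) ||
        maxIterationsReached[V](maxIter = 200)
    ```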

  24. final def ne(arg0: AnyRef): Boolean
      Definition Classes: AnyRef
  25. final def notify(): Unit
      Definition Classes: AnyRef
  26. final def notifyAll(): Unit
      Definition Classes: AnyRef
  27. def searchFailed[T]: ConvergenceCheck[T]
  28. final def synchronized[T0](arg0: ⇒ T0): T0
      Definition Classes: AnyRef
  29. def toString(): String
      Definition Classes: AnyRef → Any
  30. final def wait(): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  31. final def wait(arg0: Long, arg1: Int): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  32. final def wait(arg0: Long): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
