breeze.optimize.FirstOrderMinimizer

State

case class State(x: T, value: Double, grad: T, adjustedValue: Double, adjustedGradient: T, iter: Int, initialAdjVal: Double, history: History, fVals: IndexedSeq[Double] = ..., numImprovementFailures: Int = 0, searchFailed: Boolean = false) extends Product with Serializable

Tracks the state of the optimizer, including the current point, its value and gradient, and any optimizer-specific history. Also carries the information used to check convergence. (A usage sketch follows the parameter list below.)

x

the current point being considered

value

f(x)

grad

f.gradientAt(x)

adjustedValue

f(x) + r(x), where r is any regularization added to the objective. For LBFGS, this is f(x).

adjustedGradient

f'(x) + r'(x), where r is any regularization added to the objective. For LBFGS, this is f'(x).

iter

the current iteration number

initialAdjVal

f(x_0) + r(x_0), used for checking convergence

history

any information needed by the optimizer to do updates.

fVals

the sequence of the last minImprovementWindow function values, used to check whether the value has stopped improving

numImprovementFailures

the number of consecutive iterations in which the objective has failed to improve; used mostly by SGD

searchFailed

whether the line search failed
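
As a concrete illustration, here is a minimal sketch (assuming a recent Breeze release; the quadratic objective f(x) = ||x - 3||^2 is made up for the example and is not part of this API) that drives an LBFGS minimizer and reads x, value, grad, and iter off each intermediate State:

  import breeze.linalg.{DenseVector, norm}
  import breeze.optimize.{DiffFunction, LBFGS}

  // Toy objective: f(x) = ||x - 3||^2, with gradient 2 * (x - 3).
  val f = new DiffFunction[DenseVector[Double]] {
    def calculate(x: DenseVector[Double]): (Double, DenseVector[Double]) = {
      val diff = x - 3.0
      (diff dot diff, diff * 2.0)
    }
  }

  val lbfgs = new LBFGS[DenseVector[Double]](maxIter = 50, m = 4)

  // iterations(...) yields one State per step; each State carries x, value, grad, iter, ...
  lbfgs.iterations(f, DenseVector.zeros[Double](5)).foreach { state =>
    println(f"iter=${state.iter}%3d  value=${state.value}%.6f  gradNorm=${norm(state.grad)}%.3e")
  }

Each State is an immutable case class, so intermediate states can be logged or inspected without affecting the optimization.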

Linear Supertypes
Serializable, Serializable, Product, Equals, AnyRef, Any

Instance Constructors

  1. new State(x: T, value: Double, grad: T, adjustedValue: Double, adjustedGradient: T, iter: Int, initialAdjVal: Double, history: History, fVals: IndexedSeq[Double] = ..., numImprovementFailures: Int = 0, searchFailed: Boolean = false)


Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def actuallyConverged: Boolean

    true if the function value hasn't changed for several iterations or if the gradient's norm is near 0

  7. val adjustedGradient: T

    f'(x) + r'(x), where r is any regularization added to the objective. For LBFGS, this is f'(x).

  8. val adjustedValue: Double

    f(x) + r(x), where r is any regularization added to the objective. For LBFGS, this is f(x).

  9. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  10. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  11. def converged: Boolean

    True if the optimizer thinks it's done.

  12. def convergedReason: Option[ConvergenceReason]

    The reason the optimizer stopped, if it considers itself converged. (A usage sketch follows the member list.)

  13. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  14. val fVals: IndexedSeq[Double]

    the sequence of the last minImprovementWindow function values, used to check whether the value has stopped improving

  15. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  16. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  17. val grad: T

    f.gradientAt(x)

  18. val history: History

    any information needed by the optimizer to do updates.

  19. val initialAdjVal: Double

    f(x_0) + r(x_0), used for checking convergence

  20. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  21. val iter: Int

    the current iteration number

  22. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  23. final def notify(): Unit

    Definition Classes
    AnyRef
  24. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  25. val numImprovementFailures: Int

    the number of consecutive iterations in which the objective has failed to improve; used mostly by SGD

  26. val searchFailed: Boolean

    whether the line search failed

  27. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  28. val value: Double

    f(x)

  29. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  30. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  31. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  32. val x: T

    the current point being considered
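
The converged, actuallyConverged, and convergedReason members above are most naturally checked on the final State. Below is a minimal sketch under the same assumptions as the earlier one (a recent Breeze release and the same made-up quadratic objective), using minimizeAndReturnState to run the iteration to completion and hand back that final State:

  import breeze.linalg.DenseVector
  import breeze.optimize.{DiffFunction, LBFGS}

  // Same toy objective as in the earlier sketch: f(x) = ||x - 3||^2.
  val f = new DiffFunction[DenseVector[Double]] {
    def calculate(x: DenseVector[Double]): (Double, DenseVector[Double]) = {
      val diff = x - 3.0
      (diff dot diff, diff * 2.0)
    }
  }

  val lbfgs = new LBFGS[DenseVector[Double]](maxIter = 100, m = 7)
  val finalState = lbfgs.minimizeAndReturnState(f, DenseVector.zeros[Double](5))

  // converged reflects the optimizer's overall stopping decision; actuallyConverged only
  // looks at value/gradient progress; convergedReason, when defined, names the criterion.
  println(s"converged=${finalState.converged} actuallyConverged=${finalState.actuallyConverged}")
  println(s"stopped after ${finalState.iter} iterations at adjusted value ${finalState.adjustedValue}")
  finalState.convergedReason.foreach(reason => println(s"reason: $reason"))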
