Class AdamWeightDecay

Package: com.intel.analytics.zoo.pipeline.api.keras.optimizers

class AdamWeightDecay[T] extends SGD[T]

Implements the BERT version of the Adam algorithm: Adam with decoupled weight decay and optional learning-rate warmup, as used to train BERT.
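A hedged usage sketch (the `Float` numeric import path and the Optimizer wiring follow BigDL conventions and are assumptions, not part of this page):

```scala
import com.intel.analytics.zoo.pipeline.api.keras.optimizers.AdamWeightDecay
// Implicit TensorNumeric for Float (BigDL convention; import path assumed)
import com.intel.analytics.bigdl.tensor.TensorNumericMath.TensorNumeric.NumericFloat

// BERT-style fine-tuning schedule: warm the learning rate up over the
// first 10% of 1000 training steps, then decay it linearly, with
// decoupled weight decay of 0.01.
val optimMethod = new AdamWeightDecay[Float](
  lr = 2e-5,
  warmupPortion = 0.1,
  total = 1000,
  schedule = "linear",
  weightDecay = 0.01)
```

Such an instance is typically passed to a BigDL `Optimizer` via `setOptimMethod` before calling `optimize()`.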

Linear Supertypes
SGD[T], OptimMethod[T], Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new AdamWeightDecay(lr: Double = 1e-3, warmupPortion: Double = -1, total: Int = -1, schedule: String = "linear", beta1: Double = 0.9, beta2: Double = 0.999, epsilon: Double = 1e-6, weightDecay: Double = 0.01)(implicit arg0: ClassTag[T], ev: TensorNumeric[T])

    lr

    learning rate. Default: 1e-3

    warmupPortion

    portion of the total steps used for warmup; -1 means no warmup. Default: -1

    total

    total number of training steps for the learning rate schedule; -1 means a constant learning rate. Default: -1

    schedule

    schedule to use for the warmup. Default: "linear"

    beta1

    first moment coefficient. Default: 0.9

    beta2

    second moment coefficient. Default: 0.999

    epsilon

    small constant for numerical stability. Default: 1e-6

    weightDecay

    weight decay rate. Default: 0.01

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. var beta1: Double

    first moment coefficient

  6. var beta2: Double

    second moment coefficient

  7. def clearHistory(): Unit

    Definition Classes
    AdamWeightDecay → SGD → OptimMethod
  8. def clone(): OptimMethod[T]

    Definition Classes
    OptimMethod → AnyRef
  9. var dampening: Double

    Definition Classes
    SGD
  10. var epsilon: Double

    small constant for numerical stability

  11. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  12. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  13. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  14. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  15. def getHyperParameter(config: Table): String

    Definition Classes
    SGD → OptimMethod
  16. def getHyperParameter(): String

    Definition Classes
    SGD → OptimMethod
  17. def getLearningRate(): Double

    Definition Classes
    AdamWeightDecay → SGD → OptimMethod
  18. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  19. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  20. var learningRate: Double

    Definition Classes
    SGD
  21. var learningRateDecay: Double

    Definition Classes
    SGD
  22. var learningRateSchedule: LearningRateSchedule

    Definition Classes
    SGD
  23. var learningRates: Tensor[T]

    Definition Classes
    SGD
  24. def loadFromTable(config: Table): AdamWeightDecay.this.type

    Definition Classes
    AdamWeightDecay → SGD → OptimMethod
  25. var lr: Double

    learning rate

  26. var momentum: Double

    Definition Classes
    SGD
  27. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  28. var nesterov: Boolean

    Definition Classes
    SGD
  29. final def notify(): Unit

    Definition Classes
    AnyRef
  30. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  31. def optimize(feval: (Tensor[T]) ⇒ (T, Tensor[T]), parameter: Tensor[T]): (Tensor[T], Array[T])

    feval

    a function that takes a single input (X), the point of evaluation, and returns f(X) and df/dX

    parameter

    the initial point

    returns

    the new x vector and the function list {fx}, evaluated before the update

    Definition Classes
    AdamWeightDecay → SGD → OptimMethod
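A minimal sketch of calling `optimize` directly on a toy objective, assuming BigDL's `Tensor` API (`fill`, `sumSquare`, and `mul` are BigDL tensor operations; their availability should be checked against the BigDL version in use):

```scala
import com.intel.analytics.bigdl.tensor.Tensor
import com.intel.analytics.bigdl.tensor.TensorNumericMath.TensorNumeric.NumericFloat
import com.intel.analytics.zoo.pipeline.api.keras.optimizers.AdamWeightDecay

// Minimize f(x) = sum(x_i^2); its gradient is df/dx = 2x.
val feval = (x: Tensor[Float]) => {
  val fx = x.sumSquare()          // f(X)
  val dfdx = x.clone().mul(2.0f)  // df/dX
  (fx, dfdx)
}

val x = Tensor[Float](2).fill(1.0f)  // initial point
val optim = new AdamWeightDecay[Float](lr = 0.1)
for (_ <- 1 to 100) {
  // Each call updates x in place and returns (x, Array(fx)),
  // where fx was evaluated before the update.
  optim.optimize(feval, x)
}
```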
  32. def save(path: String, overWrite: Boolean): AdamWeightDecay.this.type

    Definition Classes
    OptimMethod
  33. var schedule: String

    schedule to use for the warmup. Default: "linear"

  34. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  35. def toString(): String

    Definition Classes
    AnyRef → Any
  36. var total: Int

    total number of training steps for the learning rate schedule; -1 means a constant learning rate. Default: -1

  37. def updateHyperParameter(timeStep: Double): Double

  38. def updateHyperParameter(config: Table, state: Table): Unit

    Definition Classes
    SGD → OptimMethod
  39. def updateHyperParameter(): Unit

    Definition Classes
    SGD → OptimMethod
  40. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  41. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  42. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  43. def warmupMethod(x: Double, warmup: Double = 0.002): Double
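For orientation, the linear schedule in BERT's reference implementation behaves roughly as the standalone sketch below; this function is an assumption for illustration, not a copy of `warmupMethod`, so verify the exact formula against the Analytics Zoo source:

```scala
// x: training progress in [0, 1] (currentStep / total)
// warmup: portion of the total steps used for warmup
def warmupLinear(x: Double, warmup: Double = 0.002): Double =
  if (x < warmup) x / warmup  // ramp linearly from 0 up to 1 during warmup
  else 1.0 - x                // then decay linearly back toward 0

// With a 10% warmup portion, halfway through warmup the multiplier is 0.5:
// warmupLinear(0.05, 0.1) == 0.5
```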
  44. var warmupPortion: Double

    portion of the total steps used for warmup; -1 means no warmup. Default: -1

  45. var weightDecay: Double

    weight decay rate

    Definition Classes
    SGD
  46. var weightDecays: Tensor[T]

    Definition Classes
    SGD

Deprecated Value Members

  1. def clearHistory(state: Table): Table

    Definition Classes
    OptimMethod
    Annotations
    @deprecated
    Deprecated

    (Since version 0.2.0) Please use clearHistory() instead

  2. def optimize(feval: (Tensor[T]) ⇒ (T, Tensor[T]), parameter: Tensor[T], config: Table, state: Table): (Tensor[T], Array[T])

    Definition Classes
    OptimMethod
    Annotations
    @deprecated
    Deprecated

    (Since version 0.2.0) Please initialize OptimMethod with parameters when creating it instead of importing table
