com.thoughtworks.deeplearning

DifferentiableFloat

object DifferentiableFloat

A namespace of common operators for Float layers.

Linear Supertypes
AnyRef, Any

Type Members

  1. final class FloatLayerOps[Input <: Tape] extends AnyRef

  2. implicit final class NativeFloatOps extends AnyRef
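
    A hedged usage sketch, not part of the original Scaladoc: assuming NativeFloatOps provides the toWeight method shown in the library's README (and that an implicit optimizer is in scope), a native Float can be lifted into a trainable weight.
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.DifferentiableFloat.Optimizers.LearningRate
      implicit val optimizer = new LearningRate {
        def currentLearningRate() = 0.01f  // fixed step size; method name assumed from Optimizers below
      }
      val weight = 1.0f.toWeight  // toWeight is an assumption based on the library's README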

  3. trait OptimizerFactory extends AnyRef
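
    A minimal sketch of supplying a per-weight optimizer through this trait; the floatOptimizer method name and the Layers.Weight type are assumptions based on the library's conventions, not quoted from this page.
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.DifferentiableFloat.Layers.Weight
      import com.thoughtworks.deeplearning.DifferentiableFloat.Optimizers.{LearningRate, Optimizer}
      implicit val optimizerFactory = new DifferentiableFloat.OptimizerFactory {
        def floatOptimizer(weight: Weight): Optimizer = new LearningRate {
          def currentLearningRate() = 0.001f  // per-weight learning rate
        }
      }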

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. implicit def Float*Float[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers. The returned Case is used by the polymorphic function *, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        inputFloatLayer * anotherFloatLayer
      }
  7. implicit def Float+Float[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers. The returned Case is used by the polymorphic function +, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Poly
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathMethods.+(inputFloatLayer, anotherFloatLayer)
      }
  8. implicit def Float-Float[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers. The returned Case is used by the polymorphic function -, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Poly
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathMethods.-(inputFloatLayer, anotherFloatLayer)
      }
  9. implicit def Float/Float[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers. The returned Case is used by the polymorphic function /, which is called in MathOps.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Poly
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathMethods./(inputFloatLayer, anotherFloatLayer)
      }
  10. object Layers

  11. object OptimizerFactory

  12. object Optimizers

    Optimizers of Float.
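
    A hedged sketch, assuming a LearningRate trait with an abstract currentLearningRate method (a common pattern in this library, but not spelled out on this page):
    1. import com.thoughtworks.deeplearning.DifferentiableFloat.Optimizers.LearningRate
      implicit val optimizer = new LearningRate {
        def currentLearningRate() = 0.01f  // constant step size for all Float weights
      }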

  13. implicit def abs(Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts a Float Layer for the polymorphic function abs.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Poly
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.abs(inputFloatLayer)
      }
  14. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  15. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  16. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  17. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  18. implicit def exp(Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts a Float Layer for the polymorphic function exp.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Poly
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.exp(inputFloatLayer)
      }
  19. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  20. implicit def floatToLiteral: Aux[Float, Float, Float]
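
    A hedged illustration, not from the original doc: this implicit lets a plain Float literal stand in where a Float layer is expected, so constants can be mixed into a network.
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        inputFloatLayer + 1.0f  // 1.0f is lifted to a literal layer by floatToLiteral
      }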

  21. implicit def floatTrainable: Trainable[Float, Float]

    See also

    Trainable
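
    A hedged sketch of what this instance enables: the train method from DifferentiableAny needs a Trainable for the network's output, and floatTrainable supplies it when the output (typically the loss) is a Float. The network and input value here are placeholders.
    1. import com.thoughtworks.deeplearning.DifferentiableAny._
      import com.thoughtworks.deeplearning.DifferentiableFloat._
      // myNetwork: a layer taking a Float input and producing a Float loss
      myNetwork.train(1.2f)  // one training step; requires floatTrainable in scope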

  22. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  23. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  24. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  25. implicit def log(Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts a Float Layer for the polymorphic function log.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Poly
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.log(inputFloatLayer)
      }
  26. implicit def max(Float,Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers for the polymorphic function max.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Poly
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.max(inputFloatLayer, anotherFloatLayer)
      }
  27. implicit def min(Float,Float)[Input <: Tape]: Aux[Aux[Input, Tape], Aux[Input, Tape], Aux[Input, Tape]]

    Returns a Case that accepts two Float Layers for the polymorphic function min.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Poly
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
        Poly.MathFunctions.min(inputFloatLayer, anotherFloatLayer)
      }
  28. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  29. final def notify(): Unit

    Definition Classes
    AnyRef
  30. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  31. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  32. implicit def toFloatLayerOps[From, Input <: Tape](from: From)(implicit toLayer: OfPlaceholder[From, Input, FloatPlaceholder]): FloatLayerOps[Input]

    Implicitly converts any layer to FloatLayerOps, which enables common methods for Float layers.

    Example:
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
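
    A hedged continuation (the unary_- method on FloatLayerOps is an assumption based on similar ops classes in this library, not documented on this page):
    1. import com.thoughtworks.deeplearning.DifferentiableFloat._
      import com.thoughtworks.deeplearning.Symbolic
      def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
        -inputFloatLayer  // the layer is first converted by toFloatLayerOps
      }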
  33. def toString(): String

    Definition Classes
    AnyRef → Any
  34. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  35. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  36. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
