Returns a Case that accepts two Float Layers. The returned Case is used by the polymorphic function `*`, which is called in MathOps.

```scala
import com.thoughtworks.deeplearning.DifferentiableFloat._
import com.thoughtworks.deeplearning.Symbolic

// The implicit parameter list must come last in Scala.
def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
  inputFloatLayer * anotherFloatLayer
}
```
Returns a Case that accepts two Float Layers. The returned Case is used by the polymorphic function `+`, which is called in MathOps.

```scala
import com.thoughtworks.deeplearning.DifferentiableFloat._
import com.thoughtworks.deeplearning.Symbolic

// The implicit parameter list must come last in Scala.
def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
  Poly.MathMethods.+(inputFloatLayer, anotherFloatLayer)
}
```
Returns a Case that accepts two Float Layers. The returned Case is used by the polymorphic function `-`, which is called in MathOps.

```scala
import com.thoughtworks.deeplearning.DifferentiableFloat._
import com.thoughtworks.deeplearning.Symbolic

// The implicit parameter list must come last in Scala.
def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
  Poly.MathMethods.-(inputFloatLayer, anotherFloatLayer)
}
```
Returns a Case that accepts two Float Layers. The returned Case is used by the polymorphic function `/`, which is called in MathOps.

```scala
import com.thoughtworks.deeplearning.DifferentiableFloat._
import com.thoughtworks.deeplearning.Symbolic

// The implicit parameter list must come last in Scala.
def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
  Poly.MathMethods./(inputFloatLayer, anotherFloatLayer)
}
```
Optimizers for Float.
Returns a Case that accepts a Float Layer for the polymorphic function abs.
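A sketch in the same style as the max and min examples, assuming abs is exposed as `Poly.MathFunctions.abs`:

```scala
import com.thoughtworks.deeplearning.DifferentiableFloat._
import com.thoughtworks.deeplearning.Symbolic

// Applies the polymorphic abs function to a Float layer.
def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
  Poly.MathFunctions.abs(inputFloatLayer)
}
```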
Returns a Case that accepts a Float Layer for the polymorphic function exp.
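A sketch in the same style as the max and min examples, assuming exp is exposed as `Poly.MathFunctions.exp`:

```scala
import com.thoughtworks.deeplearning.DifferentiableFloat._
import com.thoughtworks.deeplearning.Symbolic

// Applies the polymorphic exp function to a Float layer.
def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
  Poly.MathFunctions.exp(inputFloatLayer)
}
```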
Returns a Case that accepts a Float Layer for the polymorphic function log.
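A sketch in the same style as the max and min examples, assuming log is exposed as `Poly.MathFunctions.log`:

```scala
import com.thoughtworks.deeplearning.DifferentiableFloat._
import com.thoughtworks.deeplearning.Symbolic

// Applies the polymorphic log function to a Float layer.
def myNetwork(implicit inputFloatLayer: Float @Symbolic) = {
  Poly.MathFunctions.log(inputFloatLayer)
}
```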
Returns a Case that accepts two Float Layers for the polymorphic function max.

```scala
import com.thoughtworks.deeplearning.DifferentiableFloat._
import com.thoughtworks.deeplearning.Symbolic

// The implicit parameter list must come last in Scala.
def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
  Poly.MathFunctions.max(inputFloatLayer, anotherFloatLayer)
}
```
Returns a Case that accepts two Float Layers for the polymorphic function min.

```scala
import com.thoughtworks.deeplearning.DifferentiableFloat._
import com.thoughtworks.deeplearning.Symbolic

// The implicit parameter list must come last in Scala.
def myNetwork(anotherFloatLayer: Float @Symbolic)(implicit inputFloatLayer: Float @Symbolic) = {
  Poly.MathFunctions.min(inputFloatLayer, anotherFloatLayer)
}
```
Implicitly converts any layer to FloatLayerOps, which enables common methods for Float layers.

```scala
import com.thoughtworks.deeplearning.DifferentiableFloat._
```
A namespace of common operators for Float layers.