Returns a Case that accepts two Double Layers. The returned Case is used by the polymorphic function *, which is called in MathOps.
import com.thoughtworks.deeplearning.DifferentiableDouble._
import com.thoughtworks.deeplearning.Symbolic

// Note: the implicit parameter section must come last in Scala.
def myNetwork(anotherDoubleLayer: Double @Symbolic)(implicit inputDoubleLayer: Double @Symbolic) = {
  inputDoubleLayer * anotherDoubleLayer
}
Returns a Case that accepts two Double Layers. The returned Case is used by the polymorphic function +, which is called in MathOps.
import com.thoughtworks.deeplearning.DifferentiableDouble._
import com.thoughtworks.deeplearning.Poly
import com.thoughtworks.deeplearning.Symbolic

// Note: the implicit parameter section must come last in Scala.
def myNetwork(anotherDoubleLayer: Double @Symbolic)(implicit inputDoubleLayer: Double @Symbolic) = {
  Poly.MathMethods.+(inputDoubleLayer, anotherDoubleLayer)
}
Returns a Case that accepts two Double Layers. The returned Case is used by the polymorphic function -, which is called in MathOps.
import com.thoughtworks.deeplearning.DifferentiableDouble._
import com.thoughtworks.deeplearning.Poly
import com.thoughtworks.deeplearning.Symbolic

// Note: the implicit parameter section must come last in Scala.
def myNetwork(anotherDoubleLayer: Double @Symbolic)(implicit inputDoubleLayer: Double @Symbolic) = {
  Poly.MathMethods.-(inputDoubleLayer, anotherDoubleLayer)
}
Returns a Case that accepts two Double Layers. The returned Case is used by the polymorphic function /, which is called in MathOps.
import com.thoughtworks.deeplearning.DifferentiableDouble._
import com.thoughtworks.deeplearning.Poly
import com.thoughtworks.deeplearning.Symbolic

// Note: the implicit parameter section must come last in Scala.
def myNetwork(anotherDoubleLayer: Double @Symbolic)(implicit inputDoubleLayer: Double @Symbolic) = {
  Poly.MathMethods./(inputDoubleLayer, anotherDoubleLayer)
}
Optimizers of Double.
implicit val optimizerFactory = new DifferentiableDouble.OptimizerFactory {
  override def doubleOptimizer(weight: Weight): Optimizer = {
    new LearningRate with L2Regularization {
      var learningRate = 0.00003
      override protected def l2Regularization: Double = 0.003
      override protected def currentLearningRate(): Double = {
        // Decay the learning rate on each step (the original had a no-op
        // `learningRate * 0.75`, which discarded the result).
        learningRate *= 0.75
        learningRate
      }
    }
  }
}
Returns a Case that accepts a Double Layer for the polymorphic function abs.
Returns a Case that accepts a Double Layer for the polymorphic function exp.
Returns a Case that accepts a Double Layer for the polymorphic function log.
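A sketch of how these unary functions are used, following the same calling pattern as the max and min examples below (assuming abs, exp, and log are exposed on Poly.MathFunctions, as max and min are):

```scala
import com.thoughtworks.deeplearning.DifferentiableDouble._
import com.thoughtworks.deeplearning.Poly
import com.thoughtworks.deeplearning.Symbolic

// Networks applying each unary polymorphic function to a Double layer.
def absNetwork(implicit inputDoubleLayer: Double @Symbolic) = {
  Poly.MathFunctions.abs(inputDoubleLayer)
}

def expNetwork(implicit inputDoubleLayer: Double @Symbolic) = {
  Poly.MathFunctions.exp(inputDoubleLayer)
}

def logNetwork(implicit inputDoubleLayer: Double @Symbolic) = {
  Poly.MathFunctions.log(inputDoubleLayer)
}
```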
Returns a Case that accepts two Double Layers for the polymorphic function max.
import com.thoughtworks.deeplearning.DifferentiableDouble._
import com.thoughtworks.deeplearning.Poly
import com.thoughtworks.deeplearning.Symbolic

// Note: the implicit parameter section must come last in Scala.
def myNetwork(anotherDoubleLayer: Double @Symbolic)(implicit inputDoubleLayer: Double @Symbolic) = {
  Poly.MathFunctions.max(inputDoubleLayer, anotherDoubleLayer)
}
Returns a Case that accepts two Double Layers for the polymorphic function min.
import com.thoughtworks.deeplearning.DifferentiableDouble._
import com.thoughtworks.deeplearning.Poly
import com.thoughtworks.deeplearning.Symbolic

// Note: the implicit parameter section must come last in Scala.
def myNetwork(anotherDoubleLayer: Double @Symbolic)(implicit inputDoubleLayer: Double @Symbolic) = {
  Poly.MathFunctions.min(inputDoubleLayer, anotherDoubleLayer)
}
Implicitly converts any layer to DoubleLayerOps, which enables common methods for Double layers.
import com.thoughtworks.deeplearning.DifferentiableDouble._
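A sketch of what the conversion enables, assuming DoubleLayerOps provides unary negation as one of its "common methods" (the exact method set is not listed here):

```scala
import com.thoughtworks.deeplearning.DifferentiableDouble._
import com.thoughtworks.deeplearning.Symbolic

// With the implicit conversion in scope, a Double layer gains operator
// methods; unary negation is assumed here for illustration.
def myNetwork(implicit inputDoubleLayer: Double @Symbolic) = {
  -inputDoubleLayer
}
```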
A namespace of common operators for Double layers.
Author:
杨博 (Yang Bo) <[email protected]>