A plugin that provides differentiable operators on neural networks whose `Data` and `Delta` are `scala.Double`.

Author: 杨博 (Yang Bo)

Given a `DoubleWeight`,

```scala
import com.thoughtworks.deeplearning.plugins._
import com.thoughtworks.feature.Factory

val hyperparameters = Factory[
  DoubleTraining with ImplicitsSingleton with Operators with CumulativeDoubleLayers with DoubleWeights
].newInstance()
import hyperparameters.implicits._

val weight1 = hyperparameters.DoubleWeight(10)
```

then the training result should be applied on it:

```scala
weight1.train.map { result =>
  // the returned result is the value computed in the forward pass
  result should be(10.0)
  // the backward pass has already updated the weight
  weight1.data should be < 10.0
}
```
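The `should` assertions in these snippets are ScalaTest matchers rather than part of the plugin. A minimal sketch of the import they assume (ScalaTest 3.0.x syntax; the exact test style is an assumption, not stated by the source):

```scala
// Assumed test scaffolding for the `should be` assertions in the examples
import org.scalatest.Matchers._
```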
Given two `DoubleWeight`s,

```scala
import com.thoughtworks.deeplearning.plugins._
import com.thoughtworks.feature.Factory

val hyperparameters = Factory[
  DoubleTraining with ImplicitsSingleton with Operators with CumulativeDoubleLayers with DoubleWeights
].newInstance()
import hyperparameters.implicits._

val weight1 = hyperparameters.DoubleWeight(10)
val weight2 = hyperparameters.DoubleWeight(300)
```

when adding them together,

```scala
val weight1PlusWeight2 = weight1 + weight2
```

then the training result should be applied on both weights:

```scala
weight1PlusWeight2.train.map { result =>
  result should be(310.0)
  // both addends receive the delta, so both weights decrease
  weight2.data should be < 300.0
  weight1.data should be < 10.0
}
```
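Since `train` both computes the result and applies the accumulated deltas to every contributing weight, it can be chained to take several gradient steps. A minimal sketch, assuming the `weight1PlusWeight2` layer defined above and that `train` returns a `com.thoughtworks.future.Future` supporting `flatMap`, as in the examples (the helper name `trainRepeatedly` is hypothetical):

```scala
import com.thoughtworks.future.Future

// Hypothetical helper: take `n` gradient steps on the same layer.
// Each `train` call runs one forward/backward pass and updates both
// weight1 and weight2 before the next step begins.
def trainRepeatedly(n: Int): Future[Double] =
  if (n <= 1) weight1PlusWeight2.train
  else weight1PlusWeight2.train.flatMap { _ => trainRepeatedly(n - 1) }

// After ten steps, weight1.data and weight2.data should have moved
// further below their initial values of 10 and 300.
val tenSteps = trainRepeatedly(10)
```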
Note:

Unlike `DoubleLayers`, a `DoubleLayer` in this `CumulativeDoubleLayers` will share the `Tape`s created in the forward pass for all dependencies, avoiding re-evaluation in the case of diamond dependencies in a neural network.
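To see what this note means, consider a layer that is consumed twice. A minimal sketch, assuming the `hyperparameters`, `weight1` and `weight2` defined above still hold their initial values (the names `hidden`, `left`, `right` and `output` are hypothetical):

```scala
// `hidden` is the shared upstream node of a diamond-shaped graph:
// it feeds two consumers whose results are merged again in `output`.
val hidden = weight1 * weight2   // shared node
val left   = hidden + weight1    // first consumer of `hidden`
val right  = hidden + weight2    // second consumer of `hidden`
val output = left + right        // merge: the diamond is closed

// With CumulativeDoubleLayers, the forward pass creates one Tape for
// `hidden` and shares it between `left` and `right`, so `hidden` is
// evaluated once per training step instead of twice.
output.train.map { result =>
  // 2 * (10 * 300) + 10 + 300, for freshly created weights
  result should be(6310.0)
}
```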