BufferedLayer: A Layer that minimizes the computation during both the forward pass and the backward pass.
Layer: A Layer represents a neural network.
Symbolic: Provides the @Symbolic annotation to create symbolic methods, in which you can create Layers from mathematical formulas.
DifferentiableAny: A namespace of common operators for any layers.
DifferentiableBoolean: A namespace of common operators for Boolean layers.
DifferentiableCoproduct: A namespace of common operators for Coproduct layers.
DifferentiableDouble: A namespace of common operators for Double layers.
DifferentiableFloat: A namespace of common operators for Float layers.
DifferentiableHList: A namespace of common operators for HList layers.
DifferentiableINDArray: A namespace of common operators for INDArray layers.
DifferentiableInt: A namespace of common operators for Int layers.
DifferentiableNothing: A namespace of common operators for all layers.
DifferentiableSeq: A namespace of common operators for Seq layers.
Poly: A namespace of common math operators.
There are two ways to convert a value to a Layer.
This is the documentation for DeepLearning.scala.
Overview
BufferedLayer, DifferentiableAny, DifferentiableNothing, Layer, Poly and Symbolic are the base packages that contain the necessary operations; all other packages depend on these base packages. If you want to implement a layer, you need to know how to use the base packages.
Imports guidelines
If you want to use some operations of type T, you should import:

import com.thoughtworks.deeplearning.DifferentiableT._

For example, if you want to use some operations of INDArray, you should import:

import com.thoughtworks.deeplearning.DifferentiableINDArray._
If you write code that uses INDArray operations and the compiler reports an error such as a missing value or operator, you need to add this import:

import com.thoughtworks.deeplearning.DifferentiableINDArray._
Likewise, if the compiler reports a similar error for another type, you need to add the import for the corresponding DifferentiableType package.
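These errors usually mean that the implicits providing an operation are not in scope. As a rough, self-contained sketch of the mechanism (the names below, such as DifferentiableDoubleLike and reciprocal, are invented for illustration and are not DeepLearning.scala's API), importing an object's members is what brings both plain functions and implicit enrichments into scope:

```scala
// Hypothetical stand-in for a Differentiable* namespace; the real library
// exposes its per-type operations the same way, through members and
// implicits that must be imported before use.
object DifferentiableDoubleLike {
  // A plain function: without the import, `exp(x)` fails to compile with
  // "not found: value exp".
  def exp(x: Double): Double = math.exp(x)

  // An implicit enrichment: without the import, `x.reciprocal` fails with
  // "value reciprocal is not a member of Double".
  implicit class RichDouble(val self: Double) extends AnyVal {
    def reciprocal: Double = 1.0 / self
  }
}

object ImportDemo extends App {
  import DifferentiableDoubleLike._

  println(exp(0.0))        // prints 1.0
  println(4.0.reciprocal)  // prints 0.25
}
```

Once the import is added, both call styles compile, which is why the fix for these compiler errors is an import rather than a code change.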
The operators +, -, *, and / and the math functions log, exp, abs, max, and min are defined in MathMethods and MathFunctions. These methods are implemented in each DifferentiableType, so you need to import the implicits of the corresponding DifferentiableType.
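This split between where an operation is declared and where it is implemented can be pictured with a type class: the polymorphic function is declared once, while each type supplies its own implementation as an implicit that must be imported. The sketch below only models the idea; the names (MathFunctionsLike, MaxImpl, DifferentiableDoubleLike, DifferentiableIntLike) are invented, and DeepLearning.scala's actual Poly machinery differs.

```scala
// Declared once, analogous to a function in MathFunctions.
trait MaxImpl[A] {
  def apply(left: A, right: A): A
}

object MathFunctionsLike {
  // The polymorphic entry point: it compiles only when an implicit
  // implementation for A is in scope.
  def max[A](left: A, right: A)(implicit impl: MaxImpl[A]): A =
    impl(left, right)
}

// Implementations live in per-type namespaces; importing them brings the
// implicit instance into scope, analogous to importing a DifferentiableType.
object DifferentiableDoubleLike {
  implicit val doubleMax: MaxImpl[Double] = new MaxImpl[Double] {
    def apply(left: Double, right: Double): Double = math.max(left, right)
  }
}

object DifferentiableIntLike {
  implicit val intMax: MaxImpl[Int] = new MaxImpl[Int] {
    def apply(left: Int, right: Int): Int = math.max(left, right)
  }
}

object PolyDemo extends App {
  import MathFunctionsLike._
  import DifferentiableDoubleLike._ // without this: "could not find implicit value"
  import DifferentiableIntLike._

  println(max(1.5, 2.5)) // prints 2.5
  println(max(3, 7))     // prints 7
}
```

Forgetting the per-type import produces a "could not find implicit value" error, which is the type-class analogue of the missing-operator errors described above.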
Composability

Neural networks created by DeepLearning.scala are composable. You can create large networks by combining smaller networks. If two larger networks share some sub-networks, the weights in the shared sub-networks trained with one network affect the other network.
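The shared-sub-network behavior can be sketched with plain Scala: two "networks" hold a reference to the same mutable weight, so updating it through one network changes the other's predictions. This models the idea only; the Weight and Network classes below are invented and are not DeepLearning.scala's API.

```scala
// A shared, mutable weight, standing in for a shared sub-network's parameters.
final class Weight(var value: Double) {
  def applyGradient(gradient: Double, learningRate: Double = 0.1): Unit =
    value -= learningRate * gradient
}

// Two "networks" composed from the same sub-network (the shared weight).
final class Network(val shared: Weight, bias: Double) {
  def forward(x: Double): Double = shared.value * x + bias
}

object SharedDemo extends App {
  val shared = new Weight(1.0)
  val netA   = new Network(shared, 0.0)
  val netB   = new Network(shared, 5.0)

  // Training netA updates the shared weight: 1.0 - 0.1 * 2.0 = 0.8 ...
  shared.applyGradient(gradient = 2.0)

  // ... which also changes netB's predictions.
  println(netA.forward(1.0)) // prints 0.8
  println(netB.forward(1.0)) // prints 5.8
}
```

In the real library the shared state is the sub-network's trained weights rather than a single number, but the consequence is the same: training one composed network affects every network that shares the sub-network.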