package com.thoughtworks.deeplearning

This is the documentation for DeepLearning.scala.

Overview

BufferedLayer, DifferentiableAny, DifferentiableNothing, Layer, Poly and Symbolic are base packages that contain the necessary operations; all other packages depend on these base packages.

If you want to implement a layer, you need to know how to use these base packages.

Import guidelines

If you want to use operations on a type T, you should import:

import com.thoughtworks.deeplearning.DifferentiableT._

For example, if you want to use operations on INDArray, you should import:

import com.thoughtworks.deeplearning.DifferentiableINDArray._

If you write something like this:

def softmax(implicit scores: INDArray @Symbolic): INDArray @Symbolic = {
  val expScores = exp(scores)
  expScores / expScores.sum(1)
}

and the compiler shows the error:

Could not infer implicit value for com.thoughtworks.deeplearning.Symbolic[org.nd4j.linalg.api.ndarray.INDArray]

you need to add this import:

import com.thoughtworks.deeplearning.DifferentiableINDArray._

If you write something like this:

def crossEntropyLossFunction(
    implicit pair: (INDArray :: INDArray :: HNil) @Symbolic): Double @Symbolic = {
  val score = pair.head
  val label = pair.tail.head
  -(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean
}

If the compiler shows the error:

value * is not a member of com.thoughtworks.deeplearning.Layer.Aux[com.thoughtworks.deeplearning.Layer.Tape.Aux[org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray],com.thoughtworks.deeplearning.DifferentiableINDArray.INDArrayPlaceholder.Tape]
val bias = Nd4j.ones(numberOfOutputKernels).toWeight * 0.1
...

you need to add these imports:

import com.thoughtworks.deeplearning.Poly.MathMethods.*
import com.thoughtworks.deeplearning.DifferentiableINDArray._

If the compiler shows the error:

not found: value log
-(label * log(score * 0.9 + 0.1) + (1.0 - label) * log(1.0 - score * 0.9)).mean
...

you need to add these imports:

import com.thoughtworks.deeplearning.Poly.MathFunctions.*
import com.thoughtworks.deeplearning.DifferentiableINDArray._

The operators + - * / are defined in MathMethods, and log, exp, abs, max and min are defined in MathFunctions. These polymorphic methods are implemented in each DifferentiableType, so you also need to import the implicits of the corresponding DifferentiableType.
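Putting these guidelines together, the softmax example from above compiles with imports like the following. This is a sketch against the DeepLearning.scala API as described on this page; the wildcard import of MathMethods and MathFunctions is assumed to bring all of the operators and functions above into scope at once.

```scala
import org.nd4j.linalg.api.ndarray.INDArray
import com.thoughtworks.deeplearning.Symbolic
// MathMethods defines + - * /; MathFunctions defines log, exp, abs, max, min.
import com.thoughtworks.deeplearning.Poly.MathMethods._
import com.thoughtworks.deeplearning.Poly.MathFunctions._
// The INDArray implementations of those polymorphic methods.
import com.thoughtworks.deeplearning.DifferentiableINDArray._

def softmax(implicit scores: INDArray @Symbolic): INDArray @Symbolic = {
  val expScores = exp(scores)
  expScores / expScores.sum(1)
}
```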

Composability

Neural networks created by DeepLearning.scala are composable. You can create large networks by combining smaller networks. If two larger networks share a sub-network, the weights of that shared sub-network trained through one network also affect the other network.
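For example, a hypothetical sketch of weight sharing, assuming a `compose` operator like the one referenced in the See also section below; `hiddenLayer`, `headA` and `headB` stand for layers you have already built:

```scala
import com.thoughtworks.deeplearning.DifferentiableAny._

// Both larger networks plug in the same hiddenLayer instance.
val networkA = headA compose hiddenLayer
val networkB = headB compose hiddenLayer
// Training networkA updates the weights inside hiddenLayer, so the
// shared sub-network's updated weights are also used by networkB.
```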

See also

Compose


Type Members

  1. trait CumulativeLayer extends Layer

A Layer that minimizes the computation during both the forward pass and the backward pass.

  2. trait Layer extends AnyRef

    A Layer represents a neural network.

  3. trait Symbolic[NativeOutput] extends AnyRef

    Provides @Symbolic annotation to create symbolic methods, in which you can create Layers from mathematical formulas.

Value Members

  1. object CumulativeLayer

  2. object DifferentiableAny

    A namespace of common operators for any layers.

  3. object DifferentiableBoolean

    A namespace of common operators for Boolean layers.

  4. object DifferentiableCoproduct

    A namespace of common operators for Coproduct layers.

  5. object DifferentiableDouble

    A namespace of common operators for Double layers.

  6. object DifferentiableFloat

    A namespace of common operators for Float layers.

  7. object DifferentiableHList

    A namespace of common operators for HList layers.

  8. object DifferentiableINDArray

    A namespace of common operators for INDArray layers.

  9. object DifferentiableInt

    A namespace of common operators for Int layers.

  10. object DifferentiableNothing

    A namespace of common operators for all layers.

  11. object DifferentiableSeq

    A namespace of common operators for Seq layers.

  12. object Layer

  13. object Poly

    A namespace of common math operators.

  14. object Symbolic extends LowPrioritySymbolic

There are two ways to convert a value to a Layer.
