kr.ac.kaist.ir.deep.fn

package fn

Package for various functions: activation functions, objective functions, and weight-update algorithms.

Linear Supertypes
AnyRef, Any

Type Members

  1. trait Activation extends (ScalarMatrix) ⇒ ScalarMatrix with Serializable

Trait that describes an activation function for each layer

Because these activation functions can be shared, we recommend implementing an inherited one as an object, as in the sketch below.
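
For illustration only (the abstract member signatures below are assumptions inferred from the bundled activation objects, which all provide apply and derivative), a custom activation might look like:

    import kr.ac.kaist.ir.deep.fn._

    // Hypothetical Leaky ReLU activation; stateless, hence an object.
    object LeakyReLU extends Activation {
      // forward pass: elementwise x if x > 0, otherwise 0.01 * x
      override def apply(x: ScalarMatrix): ScalarMatrix =
        x.map(v => if (v > 0f) v else 0.01f * v)

      // derivative in terms of fx = apply(x), matching the convention
      // of the bundled activations (e.g. Sigmoid.derivative(fx))
      override def derivative(fx: ScalarMatrix): ScalarMatrix =
        fx.map(v => if (v > 0f) 1f else 0.01f)
    }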

  2. implicit class ActivationOp extends Serializable

Defines transformation operations for building a new activation function.

  3. class AdaDelta extends WeightUpdater

Algorithm: AdaDelta

If you use this algorithm in your research, you should cite the AdaDelta technical report.

    Example:
    1. val algorithm = new AdaDelta(l2decay = 0.0001)
  4. class AdaGrad extends WeightUpdater

Algorithm: AdaGrad

If you use this algorithm in your research, you should cite the AdaGrad paper.

    Example:
    1. val algorithm = new AdaGrad(l2decay = 0.0001)
  5. trait Objective extends (ScalarMatrix, ScalarMatrix) ⇒ Scalar with Serializable

Trait that describes an objective function for the entire network

Because these objective functions can be shared, we recommend implementing an inherited one as an object, as in the sketch below.
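
For illustration only (the apply/derivative signatures are assumptions inferred from the objective objects in this package), a custom objective might look like:

    import breeze.linalg.sum
    import kr.ac.kaist.ir.deep.fn._

    // Hypothetical objective: plain squared error; SquaredErr below is
    // the bundled equivalent.
    object MySquaredErr extends Objective {
      // error between the desired output `real` and the network output
      override def apply(real: ScalarMatrix, output: ScalarMatrix): Scalar = {
        val diff = real - output
        sum(diff.map(v => v * v))
      }

      // gradient of the error with respect to the network output
      override def derivative(real: ScalarMatrix, output: ScalarMatrix): ScalarMatrix =
        (output - real) * 2.0f
    }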

  6. type Probability = Float

Type of probability.

  7. implicit class ProbabilityOp extends AnyRef

Defines sugar operations for probabilities.

  8. type Scalar = Float

Type of scalar.

  9. type ScalarMatrix = DenseMatrix[Scalar]

Type of neuron input.

  10. implicit class ScalarMatrixOp extends AnyRef

    Defines sugar operations for ScalarMatrix

  11. class StochasticGradientDescent extends WeightUpdater

Algorithm: Stochastic Gradient Descent

Basic gradient descent rule with mini-batch training; the update rule is sketched after the example.

    Example:
    1. val algorithm = new StochasticGradientDescent(l2decay = 0.0001)
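
As a sketch only (the exact rule is an assumption based on the constructor's l2decay parameter, not taken from the source), one mini-batch step with learning rate η has the form: w ← w - η · (δ + l2decay · w), where δ is the gradient averaged over the mini-batch.
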
  12. implicit class WeightSeqOp extends AnyRef

Defines sugar operations for a sequence of weights.

  13. trait WeightUpdater extends (IndexedSeq[ScalarMatrix], IndexedSeq[ScalarMatrix]) ⇒ Unit with Serializable

Trait that describes the algorithm for weight update

Because each weight update requires history, we recommend implementing an inherited one as a class, as in the sketch below.
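
For illustration only (the argument order of apply is an assumption), a momentum updater with per-weight history might look like:

    import breeze.linalg.DenseMatrix
    import kr.ac.kaist.ir.deep.fn._

    // Hypothetical updater: classical momentum. A class rather than an
    // object, because `moments` is mutable per-network history.
    class Momentum(rate: Float = 0.01f, momentum: Float = 0.9f) extends WeightUpdater {
      private var moments: IndexedSeq[ScalarMatrix] = IndexedSeq.empty

      override def apply(weights: IndexedSeq[ScalarMatrix],
                         gradients: IndexedSeq[ScalarMatrix]): Unit = {
        if (moments.isEmpty)
          moments = weights.map(w => DenseMatrix.zeros[Scalar](w.rows, w.cols))
        // m <- momentum * m + rate * g, then w <- w - m (in place)
        moments = moments.zip(gradients).map { case (m, g) => (m * momentum) + (g * rate) }
        weights.zip(moments).foreach { case (w, m) => w -= m }
      }
    }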

Value Members

  1. object CosineErr extends Objective

Objective Function: Cosine Similarity Error

This objective is computationally heavy. If you want a lighter one, use DotProductErr.

    Example:
    1. val output = net(input)
   val err = CosineErr(real, output)
   val diff = CosineErr.derivative(real, output)
    Note

This function returns 1 - cosine similarity, i.e. cosine dissimilarity.
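
In the notation of the other objectives (this expands the note above; it is not spelled out in the source): cosineErr(r, o) = 1 - (r · o) / (|r| |o|).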

  2. object CrossEntropyErr extends Objective

Objective Function: Sum of Cross-Entropy (Logistic)

    Example:
    1. val output = net(input)
   val err = CrossEntropyErr(real, output)
   val diff = CrossEntropyErr.derivative(real, output)
    Note

This objective function prefers 0/1 outputs.
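
Written out (the standard logistic form; an assumption, since the source does not spell it out): crossEntropyErr(r, o) = -Σ [r · log(o) + (1 - r) · log(1 - o)], summed over output units.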

  3. object DotProductErr extends Objective

Objective Function: Dot-product Error

    Example:
    1. val output = net(input)
   val err = DotProductErr(real, output)
   val diff = DotProductErr.derivative(real, output)
    Note

This function computes the additive inverse of the dot product, i.e. dot-product dissimilarity.

  4. object HardSigmoid extends Activation

Activation Function: Hard version of Sigmoid

    Example:
    1. val fx = HardSigmoid(0.0)
   val diff = HardSigmoid.derivative(fx)
    Note

sigmoid(x) = 1 / [exp(-x) + 1]; the hard version approximates the sigmoid as a piecewise linear function (derived from the relationship between tanh and sigmoid, and between tanh and hard tanh).
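
One common closed form obtained from that derivation (an assumption; the exact slope used here is not stated): hardSigmoid(x) = max(0, min(1, 0.25 · x + 0.5)).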

  5. object HardTanh extends Activation

Activation Function: Hard version of Tanh (Hyperbolic Tangent)

    Example:
    1. val fx = HardTanh(0.0)
   val diff = HardTanh.derivative(fx)
    Note

tanh(x) = sinh(x) / cosh(x); the hard version approximates tanh as a piecewise linear function.
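
Explicitly, the usual piecewise form (an assumption, as the source gives no formula): hardTanh(x) = max(-1, min(1, x)).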

  6. object HyperbolicTangent extends Activation

Activation Function: Tanh (Hyperbolic Tangent)

    Example:
    1. val fx = HyperbolicTangent(0.0)
   val diff = HyperbolicTangent.derivative(fx)
    Note

    tanh(x) = sinh(x) / cosh(x)
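
Since derivative takes fx = tanh(x) rather than x, it presumably computes tanh'(x) = 1 - fx^2; this is an inference from the example above, not from the source.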

  7. object Linear extends Activation

Activation Function: Linear

    Example:
    1. val fx = Linear(0.0)
   val diff = Linear.derivative(fx)
    Note

    linear(x) = x

  8. object ManhattanErr extends Objective

Objective Function: Sum of Absolute Error

    Example:
    1. val output = net(input)
   val err = ManhattanErr(real, output)
   val diff = ManhattanErr.derivative(real, output)
    Note

    In mathematics, L1-distance is called Manhattan distance.

  9. object Rectifier extends Activation

Activation Function: Rectifier

    Example:
    1. val fx = Rectifier(0.0)
   val diff = Rectifier.derivative(fx)
    Note

    rectifier(x) = x if x > 0, otherwise 0

  10. object ScalarMatrix

Companion Object of ScalarMatrix

    This object defines various shortcuts.

  11. object ScalarMatrixSerializer extends Serializer[ScalarMatrix]

    Kryo Serializer for ScalarMatrix.
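
A minimal registration sketch (generic Kryo usage, not taken from this project):

    import com.esotericsoftware.kryo.Kryo
    import kr.ac.kaist.ir.deep.fn._

    val kryo = new Kryo()
    // the type alias ScalarMatrix resolves to DenseMatrix[Float]
    kryo.register(classOf[ScalarMatrix], ScalarMatrixSerializer)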

  12. object Sigmoid extends Activation

Activation Function: Sigmoid function

    Example:
    1. val fx = Sigmoid(0.0)
   val diff = Sigmoid.derivative(fx)
    Note

    sigmoid(x) = 1 / [exp(-x) + 1]
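
Given fx = sigmoid(x), derivative presumably computes fx · (1 - fx); this is an inference from the example above, not from the source.
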
  13. object Softplus extends Activation

Activation Function: Softplus

    Example:
    1. val fx = Softplus(0.0)
   val diff = Softplus.derivative(fx)
    Note

    softplus(x) = log[1 + exp(x)]

  14. object SquaredErr extends Objective

Objective Function: Sum of Squared Error

    Example:
    1. val output = net(input)
   val err = SquaredErr(real, output)
   val diff = SquaredErr.derivative(real, output)
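
Written out (assuming the conventional form without a 1/2 factor): squaredErr(r, o) = Σ (r - o)^2, summed over output units.
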
  15. val Tanh: HyperbolicTangent.type

Alias of HyperbolicTangent.
