Modifier and Type | Class and Description
---|---
`class` | `BaseTransformOp`: A base op for basic getters and setters
Modifier and Type | Method and Description
---|---
`TransformOp` | `TransformOp.derivative()`: The derivative operation for this op
`TransformOp` | `BaseTransformOp.derivative()`
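The `derivative()` contract pairs each transform with an op that computes its derivative. As a rough illustration of the pattern in plain Java on scalars (not the actual nd4j `TransformOp` API, which operates on `INDArray`s), a transform that knows its own derivative might look like:

```java
// Minimal sketch of the TransformOp.derivative() pattern: each transform
// can produce the op that computes its own derivative.
// Illustrative plain Java, not the nd4j API.
public class DerivativeSketch {
    public interface Transform {
        double apply(double x);
        Transform derivative();
    }

    // Sigmoid, whose derivative is sigmoid(x) * (1 - sigmoid(x))
    public static final Transform SIGMOID = new Transform() {
        public double apply(double x) { return 1.0 / (1.0 + Math.exp(-x)); }
        public Transform derivative() {
            return new Transform() {
                public double apply(double x) {
                    double s = SIGMOID.apply(x);
                    return s * (1.0 - s);
                }
                public Transform derivative() {
                    throw new UnsupportedOperationException("second derivative not implemented");
                }
            };
        }
    };

    public static void main(String[] args) {
        System.out.println(SIGMOID.apply(0.0));              // 0.5
        System.out.println(SIGMOID.derivative().apply(0.0)); // 0.25
    }
}
```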
Modifier and Type | Method and Description
---|---
`INDArray` | `OpExecutioner.execAndReturn(TransformOp op)`: Execute a TransformOp and return the result
`INDArray` | `DefaultOpExecutioner.execAndReturn(TransformOp op)`
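`execAndReturn` applies a transform and hands the result array back to the caller. A hedged sketch of that executioner pattern on a plain `double[]` (the real nd4j signatures take a `TransformOp` and work on `INDArray`):

```java
import java.util.Arrays;
import java.util.function.DoubleUnaryOperator;

// Sketch of the OpExecutioner.execAndReturn idea on plain double[]:
// apply the op elementwise and return the result buffer.
// Illustrative only; the real nd4j executioner works on INDArray.
public class ExecSketch {
    public static double[] execAndReturn(DoubleUnaryOperator op, double[] x) {
        for (int i = 0; i < x.length; i++) {
            x[i] = op.applyAsDouble(x[i]); // transform in place, then return
        }
        return x;
    }

    public static void main(String[] args) {
        double[] data = {-1.0, 0.0, 2.0};
        // a "relu"-style transform: max(0, x)
        System.out.println(Arrays.toString(execAndReturn(v -> Math.max(0.0, v), data)));
        // prints [0.0, 0.0, 2.0]
    }
}
```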
Modifier and Type | Method and Description
---|---
`TransformOp` | `DefaultOpFactory.createTransform(String name, INDArray x)`
`TransformOp` | `OpFactory.createTransform(String name, INDArray x)`
`TransformOp` | `DefaultOpFactory.createTransform(String name, INDArray x, INDArray y)`
`TransformOp` | `OpFactory.createTransform(String name, INDArray x, INDArray y)`
`TransformOp` | `DefaultOpFactory.createTransform(String name, INDArray x, INDArray y, INDArray z)`
`TransformOp` | `OpFactory.createTransform(String name, INDArray x, INDArray y, INDArray z)`
`TransformOp` | `DefaultOpFactory.createTransform(String name, INDArray x, Object[] extraArgs)`
`TransformOp` | `OpFactory.createTransform(String name, INDArray x, Object[] extraArgs)`
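The factory methods resolve a transform by its string name. A minimal sketch of that name-based lookup in plain Java; the registry and op names below are hypothetical, not nd4j's actual internal registry:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.DoubleUnaryOperator;

// Sketch of the OpFactory.createTransform(name, x) idea: resolve an op by
// name, then apply it. This registry is hypothetical; nd4j maps names to
// TransformOp instances internally.
public class FactorySketch {
    private static final Map<String, DoubleUnaryOperator> OPS = new HashMap<>();
    static {
        OPS.put("abs",  Math::abs);
        OPS.put("exp",  Math::exp);
        OPS.put("tanh", Math::tanh);
    }

    public static DoubleUnaryOperator createTransform(String name) {
        DoubleUnaryOperator op = OPS.get(name);
        if (op == null) throw new IllegalArgumentException("Unknown op: " + name);
        return op;
    }

    public static void main(String[] args) {
        System.out.println(createTransform("abs").applyAsDouble(-3.0)); // 3.0
    }
}
```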
Modifier and Type | Class and Description
---|---
`class` | `Abs`: Absolute value elementwise function
`class` | `ACos`: Arccosine elementwise function
`class` | `ASin`: Arcsine elementwise function
`class` | `ATan`: Arctangent elementwise function
`class` | `Ceil`: Ceiling elementwise function
`class` | `Cos`: Cosine elementwise function
`class` | `DropOut`: DropOut implementation as Op
`class` | `DropOutInverted`: Inverted DropOut implementation as Op
`class` | `ELU`: Exponential Linear Unit (alpha=1.0). Introduced in "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)", Djork-Arné Clevert, Thomas Unterthiner, Sepp Hochreiter (2015), http://arxiv.org/abs/1511.07289
`class` | `ELUDerivative`: Derivative of the Exponential Linear Unit (alpha=1.0); see the same paper, http://arxiv.org/abs/1511.07289
`class` | `Exp`: Exp elementwise function
`class` | `Floor`: Floor elementwise function
`class` | `HardTanh`: Hard tanh elementwise function
`class` | `HardTanhDerivative`: Hard tanh elementwise derivative function
`class` | `Identity`: Identity function
`class` | `IsMax`: Marks the maximum element with 1 and all others with 0, e.g. [1, 2, 3, 1] -> [0, 0, 1, 0]
`class` | `LeakyReLU`: Leaky rectified linear unit
`class` | `LeakyReLUDerivative`: Leaky ReLU derivative
`class` | `Log`: Log elementwise function
`class` | `LogSoftMax`: Log(softmax(x))
`class` | `MaxOut`: Maxout activation: http://arxiv.org/pdf/1302.4389.pdf
`class` | `Negative`: Negative function
`class` | `OneMinus`: 1 - input
`class` | `Ones`: Ones (represents a constant)
`class` | `Pow`: Pow function
`class` | `RectifedLinear`: Rectified linear units
`class` | `ReplaceNans`: Element-wise "Replace NaN" implementation as Op
`class` | `Round`: Rounding function
`class` | `Set`: Set
`class` | `SetRange`: Set range to a particular set of values
`class` | `Sigmoid`: Sigmoid function
`class` | `SigmoidDerivative`: Sigmoid derivative
`class` | `Sign`: Signum function
`class` | `Sin`: Sine elementwise function
`class` | `SoftMax`: Softmax function: subtracts each row's max, exponentiates, and divides by the row sum, i.e. out = exp(input - rowMax) / rowSums(exp(input - rowMax)); outputs a probability distribution per row
`class` | `SoftMaxDerivative`: Softmax derivative
`class` | `SoftPlus`
`class` | `SoftSign`: Softsign element-wise activation function
`class` | `SoftSignDerivative`: SoftSign derivative
`class` | `Sqrt`: Sqrt function
`class` | `Stabilize`: Stabilization function; forces values to be within a range
`class` | `Step`: Unit step function
`class` | `Tanh`: Tanh elementwise function
`class` | `TanhDerivative`: Tanh derivative
`class` | `TimesOneMinus`: If x is the input, the output is x * (1 - x)
`class` | `VectorFFT`: Encapsulated vector FFT operation
`class` | `VectorIFFT`: Single inverse FFT (IFFT) operation
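Several of the activations listed above have simple closed forms. A plain-Java sketch of ELU (alpha = 1.0), leaky ReLU, and a numerically stable softmax matching the "subtract the row max, exponentiate, normalize" description; illustrative only, since nd4j implements these as `TransformOp` subclasses over `INDArray`:

```java
import java.util.Arrays;

// Reference formulas for a few of the transforms above, on plain double[].
// Illustrative only; nd4j implements these as TransformOp subclasses.
public class ActivationSketch {
    // ELU with alpha = 1.0: x for x > 0, exp(x) - 1 otherwise
    public static double elu(double x) {
        return x > 0 ? x : Math.exp(x) - 1.0;
    }

    // Leaky ReLU: x for x > 0, alpha * x otherwise
    public static double leakyRelu(double x, double alpha) {
        return x > 0 ? x : alpha * x;
    }

    // Numerically stable softmax: subtract the max, exponentiate, normalize
    public static double[] softmax(double[] x) {
        double max = Arrays.stream(x).max().orElse(0.0);
        double[] e = new double[x.length];
        double sum = 0.0;
        for (int i = 0; i < x.length; i++) {
            e[i] = Math.exp(x[i] - max);
            sum += e[i];
        }
        for (int i = 0; i < e.length; i++) e[i] /= sum;
        return e;
    }

    public static void main(String[] args) {
        System.out.println(elu(1.0)); // 1.0
        System.out.println(Arrays.toString(softmax(new double[]{1.0, 1.0}))); // [0.5, 0.5]
    }
}
```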
Modifier and Type | Method and Description
---|---
`TransformOp` | `LeakyReLU.derivative()`
`TransformOp` | `Sin.derivative()`
`TransformOp` | `Tanh.derivative()`
`TransformOp` | `SoftPlus.derivative()`
`TransformOp` | `Sigmoid.derivative()`
`TransformOp` | `HardTanh.derivative()`
`TransformOp` | `ELU.derivative()`
`TransformOp` | `SoftSign.derivative()`
`TransformOp` | `RectifedLinear.derivative()`
`TransformOp` | `SoftMax.derivative()`
`TransformOp` | `Exp.derivative()`
Modifier and Type | Class and Description
---|---
`class` | `AddOp`: Addition operation
`class` | `Axpy`: Level 1 BLAS op Axpy as a libnd4j native op
`class` | `CopyOp`: Copy operation
`class` | `DivOp`: Division operation
`class` | `MulOp`: Multiplication operation
`class` | `RDivOp`: Reverse division operation
`class` | `RSubOp`: Reverse subtraction operation
`class` | `SubOp`: Subtraction operation
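The "R"-prefixed ops reverse the operand order of their base op, which matters only for non-commutative arithmetic. In scalar terms (a hedged sketch of the semantics, not the nd4j classes, which apply these elementwise to `INDArray`s):

```java
// Sketch of the pairwise ops' scalar semantics. SubOp computes x - y while
// RSubOp reverses the operands (y - x); likewise DivOp vs RDivOp.
// Illustrative only; the nd4j classes operate elementwise on INDArrays.
public class PairwiseSketch {
    public static double sub(double x, double y)  { return x - y; }
    public static double rsub(double x, double y) { return y - x; }
    public static double div(double x, double y)  { return x / y; }
    public static double rdiv(double x, double y) { return y / x; }

    public static void main(String[] args) {
        System.out.println(sub(10.0, 4.0));  // 6.0
        System.out.println(rsub(10.0, 4.0)); // -6.0
        System.out.println(div(10.0, 4.0));  // 2.5
        System.out.println(rdiv(10.0, 4.0)); // 0.4
    }
}
```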
Modifier and Type | Class and Description
---|---
`class` | `CompareAndSet`: Element-wise compare-and-set implementation as Op
`class` | `Eps`: Bit mask over the ndarrays indicating whether corresponding components are equal within an epsilon tolerance
`class` | `EqualTo`: Bit mask over the ndarrays indicating whether corresponding components are equal
`class` | `GreaterThan`: Bit mask over the ndarrays indicating whether corresponding components are greater
`class` | `GreaterThanOrEqual`: Bit mask over the ndarrays indicating whether corresponding components are greater than or equal
`class` | `LessThan`: Bit mask over the ndarrays indicating whether corresponding components are less
`class` | `LessThanOrEqual`: Bit mask over the ndarrays indicating whether corresponding components are less than or equal
`class` | `Max`: Max function
`class` | `Min`: Min function
`class` | `NotEqualTo`: Not-equal-to function: bit mask over whether two elements are not equal
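The comparison ops above produce a mask of ones and zeros. A plain-Java sketch of that element-wise bit-mask behavior (illustrative only, not the nd4j classes):

```java
import java.util.Arrays;
import java.util.function.DoubleBinaryOperator;

// Sketch of the comparison transforms: compare two arrays elementwise and
// emit 1.0 where the predicate holds, 0.0 where it does not.
// Illustrative only; nd4j implements these as TransformOp subclasses.
public class MaskSketch {
    public static double[] mask(double[] x, double[] y, DoubleBinaryOperator pred) {
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            out[i] = pred.applyAsDouble(x[i], y[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 5.0, 3.0};
        double[] y = {2.0, 5.0, 1.0};
        // GreaterThan-style mask: 1.0 where x[i] > y[i]
        double[] gt = mask(x, y, (a, b) -> a > b ? 1.0 : 0.0);
        System.out.println(Arrays.toString(gt)); // [0.0, 0.0, 1.0]
        // EqualTo-style mask: 1.0 where x[i] == y[i]
        double[] eq = mask(x, y, (a, b) -> a == b ? 1.0 : 0.0);
        System.out.println(Arrays.toString(eq)); // [0.0, 1.0, 0.0]
    }
}
```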
Modifier and Type | Class and Description
---|---
`class` | `Col2Im`: Col2Im operation (inverse of Im2col; scatters columns back into image patches)
`class` | `Im2col`: Im2col operation (rearranges image patches into columns)
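Im2col rearranges sliding image patches into the columns of a matrix so that convolution becomes a matrix multiply; Col2Im scatters them back. A minimal single-channel, stride-1, no-padding sketch (hedged: nd4j's ops additionally take stride, padding, and batched multi-channel input):

```java
// Minimal im2col sketch: single channel, stride 1, no padding.
// Each kh*kw patch of the input becomes one column of the output matrix.
// Illustrative only; the nd4j Im2col/Col2Im ops support stride, padding,
// and batched multi-channel INDArray input.
public class Im2colSketch {
    public static double[][] im2col(double[][] img, int kh, int kw) {
        int oh = img.length - kh + 1;     // number of vertical patch positions
        int ow = img[0].length - kw + 1;  // number of horizontal patch positions
        double[][] cols = new double[kh * kw][oh * ow];
        for (int i = 0; i < oh; i++) {
            for (int j = 0; j < ow; j++) {
                int col = i * ow + j;     // one output column per patch position
                for (int di = 0; di < kh; di++) {
                    for (int dj = 0; dj < kw; dj++) {
                        cols[di * kw + dj][col] = img[i + di][j + dj];
                    }
                }
            }
        }
        return cols;
    }

    public static void main(String[] args) {
        double[][] img = {
            {1, 2, 3},
            {4, 5, 6},
            {7, 8, 9}
        };
        double[][] cols = im2col(img, 2, 2); // 4 rows (patch size), 4 columns (positions)
        // first column is the top-left 2x2 patch read row by row
        System.out.println(cols[0][0] + " " + cols[1][0] + " " + cols[2][0] + " " + cols[3][0]);
        // prints 1.0 2.0 4.0 5.0
    }
}
```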
Copyright © 2016. All Rights Reserved.