Class | Description |
---|---|
Abs | Absolute value elementwise function |
ACos | Arccosine elementwise function |
ASin | Arcsine elementwise function |
ATan | Arctangent elementwise function |
Ceil | Ceiling elementwise function |
Cos | Cosine elementwise function |
Cube | Cube (x^3) elementwise function |
CubeDerivative | Cube derivative (3x^2) elementwise function |
ELU | ELU: Exponential Linear Unit (alpha=1.0). Introduced in: Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs), Djork-Arné Clevert, Thomas Unterthiner, Sepp Hochreiter (2015), http://arxiv.org/abs/1511.07289 |
ELUDerivative | Derivative of ELU: Exponential Linear Unit (alpha=1.0); see the same paper, http://arxiv.org/abs/1511.07289 |
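The ELU formula and its derivative can be sketched in plain Java. This is a standalone illustration of the math from the cited paper (with the table's default alpha=1.0), not the op implementation itself:

```java
// Standalone sketch of ELU (alpha = 1.0) and its derivative:
//   elu(x)  = x                    for x > 0
//           = alpha * (exp(x) - 1) for x <= 0
//   elu'(x) = 1                    for x > 0
//           = alpha * exp(x)       for x <= 0
public class EluExample {
    static final double ALPHA = 1.0;

    static double elu(double x) {
        // expm1 computes exp(x) - 1 accurately for small x
        return x > 0 ? x : ALPHA * Math.expm1(x);
    }

    static double eluDerivative(double x) {
        return x > 0 ? 1.0 : ALPHA * Math.exp(x);
    }

    public static void main(String[] args) {
        System.out.println(elu(2.0));   // positive inputs pass through unchanged
        System.out.println(elu(-1.0));  // negative inputs saturate toward -alpha
    }
}
```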
Exp | Exp elementwise function |
Floor | Floor elementwise function |
HardTanh | Hard tanh elementwise function |
HardTanhDerivative | Hard tanh elementwise derivative function |
Histogram | |
Identity | Identity function |
IsMax | Sets the maximum element to 1 and all others to 0, e.g. [1, 2, 3, 1] -> [0, 0, 1, 0] |
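The IsMax behavior on a single vector can be sketched as follows. This is an illustrative standalone version; the tie-breaking choice (first maximum wins) is an assumption, not something the table specifies:

```java
// Sketch of IsMax on a vector: mark the (first) maximum element
// with 1.0 and every other element with 0.0.
public class IsMaxExample {
    static double[] isMax(double[] x) {
        int argMax = 0;
        for (int i = 1; i < x.length; i++) {
            if (x[i] > x[argMax]) argMax = i;  // first max wins on ties (assumption)
        }
        double[] out = new double[x.length];   // zero-initialized by default
        out[argMax] = 1.0;
        return out;
    }

    public static void main(String[] args) {
        // Reproduces the table's example: [1, 2, 3, 1] -> [0, 0, 1, 0]
        System.out.println(java.util.Arrays.toString(isMax(new double[]{1, 2, 3, 1})));
    }
}
```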
LeakyReLU | Leaky rectified linear unit |
LeakyReLUDerivative | Leaky ReLU derivative |
LegacyDropOut | DropOut implementation as Op. PLEASE NOTE: this is the legacy DropOut implementation; consider using the op with the same name from randomOps |
LegacyDropOutInverted | Inverted DropOut implementation as Op. PLEASE NOTE: this is the legacy DropOutInverted implementation; consider using the op with the same name from randomOps |
Log | Log elementwise function |
LogSoftMax | Log(softmax(X)) |
MaxOut | Maxout activation: http://arxiv.org/pdf/1302.4389.pdf |
Negative | Negative function |
OneMinus | 1 - input |
Ones | Ones (represents a constant) |
Pow | Pow function |
RectifedLinear | Rectified linear unit |
ReplaceNans | Element-wise "replace NaN" implementation as Op |
Round | Rounding function |
Set | Set |
SetRange | Set range to a particular set of values |
Sigmoid | Sigmoid function |
SigmoidDerivative | Sigmoid derivative |
Sign | Signum function |
Sin | Sine elementwise function |
SoftMax | Softmax function: computes row_maxes (the max of each row), exponentiates input - row_maxes, and divides by the row sums, so each row outputs a probability distribution |
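The row-wise, max-shifted computation described for SoftMax can be sketched for a single row in plain Java. This is a minimal illustration of the formula, not the op implementation:

```java
// Sketch of numerically stabilized softmax on one row:
// subtract the row max before exponentiating, then normalize by the row sum.
public class SoftMaxExample {
    static double[] softmaxRow(double[] row) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : row) max = Math.max(max, v);

        double[] out = new double[row.length];
        double sum = 0.0;
        for (int i = 0; i < row.length; i++) {
            out[i] = Math.exp(row[i] - max);  // shift keeps exp() from overflowing
            sum += out[i];
        }
        for (int i = 0; i < row.length; i++) out[i] /= sum;
        return out;
    }

    public static void main(String[] args) {
        double[] p = softmaxRow(new double[]{1.0, 2.0, 3.0});
        double total = 0.0;
        for (double v : p) total += v;
        System.out.println(total);  // sums to 1: the row is a probability distribution
    }
}
```

Subtracting the row max leaves the result unchanged mathematically (the shift cancels in the ratio) but prevents `exp()` from overflowing on large inputs.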
SoftMaxDerivative | Softmax derivative |
SoftPlus | |
SoftSign | Softsign element-wise activation function |
SoftSignDerivative | SoftSign derivative |
Sqrt | Sqrt function |
Stabilize | Stabilization function; forces values to be within a range |
Step | Unit step function |
Tanh | Tanh elementwise function |
TanhDerivative | Tanh derivative |
TimesOneMinus | If x is the input, the output is x*(1-x) |
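TimesOneMinus is worth a brief illustration: when its input x is already a sigmoid activation s(z), x*(1-x) equals the sigmoid derivative s'(z) = s(z)*(1-s(z)), which is why this op commonly appears in backpropagation code. A minimal sketch:

```java
// Sketch of TimesOneMinus: out = x * (1 - x).
// Applied to a sigmoid output s(z), this yields the sigmoid derivative
// s'(z) = s(z) * (1 - s(z)).
public class TimesOneMinusExample {
    static double timesOneMinus(double x) {
        return x * (1.0 - x);
    }

    static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
    }

    public static void main(String[] args) {
        double s = sigmoid(0.0);               // 0.5
        System.out.println(timesOneMinus(s));  // 0.25, the sigmoid slope at z = 0
    }
}
```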
VectorFFT | Encapsulated vector FFT operation |
VectorIFFT | Single inverse FFT (IFFT) operation |
Copyright © 2016. All Rights Reserved.