Class | Description |
---|---|
ACos | Arccosine (inverse cosine) elementwise function |
ACosh | Inverse hyperbolic cosine (acosh) elementwise function |
ASin | Arcsine elementwise function |
ASinh | Inverse hyperbolic sine (asinh) elementwise function |
ATan | Arctangent elementwise function |
ATanh | Inverse hyperbolic tangent (atanh) elementwise function |
Cos | Cosine elementwise function |
Cosh | Hyperbolic cosine elementwise function |
ELU | ELU: Exponential Linear Unit (alpha=1.0). Introduced in: Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs), Djork-Arné Clevert, Thomas Unterthiner, Sepp Hochreiter (2015), https://arxiv.org/abs/1511.07289 |
Erf | Gaussian error function (erf), defined as erf(x) = (2/√π) ∫₀ˣ e^(−t²) dt |
Erfc | Complementary Gaussian error function (erfc), defined as erfc(x) = 1 − erf(x) |
Exp | Element-wise exponential function |
Expm1 | Element-wise exponential function minus 1, i.e. exp(x) − 1 |
GELU | GELU activation function (Gaussian Error Linear Units). For more details, see Gaussian Error Linear Units (GELUs), https://arxiv.org/abs/1606.08415. Note: this op implements both the sigmoid- and tanh-based approximations; use precise=false for the sigmoid approximation (recommended), or precise=true for the slower but marginally more accurate tanh version. |
GELUDerivative | GELU derivative |
HardSigmoid | Hard sigmoid elementwise function |
HardTanh | Hard tanh elementwise function |
Log | Log elementwise function |
Log1p | Log1p function, i.e. log(1 + x) |
LogSigmoid | LogSigmoid function |
Mish | Mish activation function |
MishDerivative | Mish derivative |
PreciseGELU | Precise GELU activation function (Gaussian Error Linear Units). For more details, see Gaussian Error Linear Units (GELUs), https://arxiv.org/abs/1606.08415. Note: this op implements both the sigmoid- and tanh-based approximations; use precise=false for the sigmoid approximation (recommended), or precise=true for the slower but marginally more accurate tanh version. |
PreciseGELUDerivative | GELU derivative |
RationalTanh | Rational tanh approximation elementwise function, as described at https://github.com/deeplearning4j/libnd4j/issues/351 |
RectifiedTanh | Rectified tanh; essentially max(0, tanh(x)) |
Rint | Rint function (rounds each element to the nearest integer) |
SELU | SELU (Scaled Exponential Linear Unit) activation function |
SetRange | Set range to a particular set of values |
Sigmoid | Sigmoid function |
SigmoidDerivative | Deprecated |
Sin | Sine elementwise function |
Sinh | Hyperbolic sine (sinh) elementwise function |
SoftPlus | Softplus elementwise function, i.e. log(1 + exp(x)) |
SoftSign | Softsign element-wise activation function |
Stabilize | Stabilization function; forces values to be within a range |
Swish | Swish function |
SwishDerivative | Swish derivative |
Tan | Tangent elementwise function |
TanDerivative | Tan derivative elementwise function |
Tanh | Tanh elementwise function |
TanhDerivative | Deprecated. Use TanhDerivative. |
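Several of the activation ops listed above are simple closed-form elementwise functions. As a rough illustration, the standard published formulas for a few of them can be sketched in plain Python; note this is not the library's own API, just the underlying math, and the function names here are chosen for readability:

```python
import math

def sigmoid(x):
    # Standard logistic sigmoid: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def gelu_sigmoid(x):
    # GELU, sigmoid-based approximation (the precise=false variant):
    # x * sigmoid(1.702 * x)
    return x * sigmoid(1.702 * x)

def gelu_tanh(x):
    # GELU, tanh-based approximation (the precise=true / PreciseGELU variant):
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + math.tanh(
        math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

def elu(x, alpha=1.0):
    # ELU with alpha=1.0: x for x > 0, alpha * (e^x - 1) otherwise
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def mish(x):
    # Mish: x * tanh(softplus(x)) = x * tanh(log(1 + e^x))
    return x * math.tanh(math.log1p(math.exp(x)))

def swish(x):
    # Swish: x * sigmoid(x)
    return x * sigmoid(x)

def softsign(x):
    # Softsign: x / (1 + |x|)
    return x / (1.0 + abs(x))

def rectified_tanh(x):
    # RectifiedTanh: max(0, tanh(x))
    return max(0.0, math.tanh(x))
```

The two GELU variants agree closely for moderate inputs; for example, both `gelu_sigmoid(1.0)` and `gelu_tanh(1.0)` evaluate to approximately 0.84.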
Copyright © 2020. All rights reserved.