public final class Activation
extends java.lang.Object
Many networks make use of the Linear block and other similar linear transformations. However,
composing any number of linear transformations only produces another linear transformation
(\(f(x) = W_2(W_1x) = (W_2W_1)x = W_{combined}x\)). In order to represent non-linear data,
non-linear functions called activation functions are interspersed between the linear
transformations. This allows the network to represent non-linear functions of increasing
complexity.
See Wikipedia for more details.
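For illustration, the sketch below composes two Linear blocks with a ReLU block from this class between them, which is exactly the pattern described above. It is a minimal sketch assuming the standard DJL NDManager, SequentialBlock, Linear, and ParameterStore APIs; the layer sizes, input shape, and class name are arbitrary choices for the example.

```java
import ai.djl.ndarray.NDList;
import ai.djl.ndarray.NDManager;
import ai.djl.ndarray.types.DataType;
import ai.djl.ndarray.types.Shape;
import ai.djl.nn.Activation;
import ai.djl.nn.SequentialBlock;
import ai.djl.nn.core.Linear;
import ai.djl.training.ParameterStore;

public class ActivationComposition {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            // Two Linear blocks with a ReLU block between them; without the
            // activation, the two layers would collapse into a single linear map.
            SequentialBlock mlp = new SequentialBlock()
                    .add(Linear.builder().setUnits(8).build())
                    .add(Activation.reluBlock())
                    .add(Linear.builder().setUnits(2).build());

            // Arbitrary example input shape: batch of 1 with 4 features.
            mlp.initialize(manager, DataType.FLOAT32, new Shape(1, 4));

            NDList input = new NDList(manager.ones(new Shape(1, 4)));
            NDList output = mlp.forward(new ParameterStore(manager, false), input, false);
            System.out.println(output.singletonOrThrow());
        }
    }
}
```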
Modifier and Type | Method and Description |
---|---|
static NDArray | elu(NDArray array, float alpha): Applies ELU (Exponential Linear Unit) activation on the input NDArray. |
static NDList | elu(NDList arrays, float alpha): Applies ELU (Exponential Linear Unit) activation on the input singleton NDList. |
static Block | eluBlock(float alpha): Creates a LambdaBlock that applies the ELU activation function in its forward function. |
static NDArray | gelu(NDArray array): Applies GELU (Gaussian Error Linear Unit) activation on the input NDArray. |
static NDList | gelu(NDList arrays): Applies GELU (Gaussian Error Linear Unit) activation on the input singleton NDList. |
static Block | geluBlock(): Creates a LambdaBlock that applies the GELU activation function in its forward function. |
static NDArray | leakyRelu(NDArray array, float alpha): Applies Leaky ReLU activation on the input NDArray. |
static NDList | leakyRelu(NDList arrays, float alpha): Applies Leaky ReLU activation on the input singleton NDList. |
static Block | leakyReluBlock(float alpha): Creates a LambdaBlock that applies the Leaky ReLU activation function in its forward function. |
static NDArray | mish(NDArray array): Applies Mish activation on the input NDArray. |
static NDList | mish(NDList arrays): Applies Mish activation on the input singleton NDList. |
static Block | mishBlock(): Creates a LambdaBlock that applies the Mish activation function in its forward function. |
static Block | preluBlock(): Returns a Prelu block. |
static NDArray | relu(NDArray array): Applies ReLU activation on the input NDArray. |
static NDList | relu(NDList arrays): Applies ReLU activation on the input singleton NDList. |
static Block | reluBlock(): Creates a LambdaBlock that applies the ReLU activation function in its forward function. |
static NDArray | selu(NDArray array): Applies Scaled ELU activation on the input NDArray. |
static NDList | selu(NDList arrays): Applies Scaled ELU activation on the input singleton NDList. |
static Block | seluBlock(): Creates a LambdaBlock that applies the SELU activation function in its forward function. |
static NDArray | sigmoid(NDArray array): Applies Sigmoid activation on the input NDArray. |
static NDList | sigmoid(NDList arrays): Applies Sigmoid activation on the input singleton NDList. |
static Block | sigmoidBlock(): Creates a LambdaBlock that applies the Sigmoid activation function in its forward function. |
static NDArray | softPlus(NDArray array): Applies softPlus activation on the input NDArray. |
static NDList | softPlus(NDList arrays): Applies softPlus activation on the input singleton NDList. |
static Block | softPlusBlock(): Creates a LambdaBlock that applies the SoftPlus activation function in its forward function. |
static NDArray | swish(NDArray array, float beta): Applies Swish activation on the input NDArray. |
static NDList | swish(NDList arrays, float beta): Applies Swish activation on the input singleton NDList. |
static Block | swishBlock(float beta): Creates a LambdaBlock that applies the Swish activation function in its forward function. |
static NDArray | tanh(NDArray array): Applies Tanh activation on the input NDArray. |
static NDList | tanh(NDList arrays): Applies Tanh activation on the input singleton NDList. |
static Block | tanhBlock(): Creates a LambdaBlock that applies the Tanh activation function in its forward function. |
public static NDArray relu(NDArray array)
Applies ReLU activation on the input NDArray.
ReLU is defined by: \( y = max(0, x) \)
public static NDList relu(NDList arrays)
Applies ReLU activation on the input singleton NDList.
ReLU is defined by: \( y = max(0, x) \)
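As a quick usage sketch (assuming the usual DJL NDManager and NDList APIs; the input values and class name are arbitrary), both overloads can be called directly on eagerly created arrays:

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDList;
import ai.djl.ndarray.NDManager;
import ai.djl.nn.Activation;

public class ReluExample {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray x = manager.create(new float[] {-2f, -0.5f, 0f, 1f, 3f});

            // Element-wise max(0, x): negative entries become 0.
            NDArray y = Activation.relu(x);
            System.out.println(y); // values: 0, 0, 0, 1, 3

            // The NDList overload expects a singleton list and returns one.
            NDList out = Activation.relu(new NDList(x));
            System.out.println(out.singletonOrThrow());
        }
    }
}
```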
public static NDArray sigmoid(NDArray array)
Applies Sigmoid activation on the input NDArray.
Sigmoid is defined by: \( y = 1 / (1 + e^{-x}) \)
public static NDList sigmoid(NDList arrays)
Applies Sigmoid activation on the input singleton NDList.
Sigmoid is defined by: \( y = 1 / (1 + e^{-x}) \)
public static NDArray tanh(NDArray array)
Applies Tanh activation on the input NDArray.
Tanh is defined by: \( y = (e^x - e^{-x}) / (e^x + e^{-x}) \)
public static NDList tanh(NDList arrays)
Applies Tanh activation on the input singleton NDList.
Tanh is defined by: \( y = (e^x - e^{-x}) / (e^x + e^{-x}) \)
public static NDArray softPlus(NDArray array)
Applies softPlus activation on the input NDArray.
softPlus is defined by: \( y = log(1 + e^x) \)
public static NDList softPlus(NDList arrays)
Applies softPlus activation on the input singleton NDList.
softPlus is defined by: \( y = log(1 + e^x) \)
public static NDArray leakyRelu(NDArray array, float alpha)
Applies Leaky ReLU activation on the input NDArray.
Leaky ReLU is defined by: \( y = x \gt 0 ? x : alpha * x \)
public static NDList leakyRelu(NDList arrays, float alpha)
Applies Leaky ReLU activation on the input singleton NDList.
Leaky ReLU is defined by: \( y = x \gt 0 ? x : alpha * x \)
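A small sketch of the alpha parameter in action, assuming the same NDManager setup as above; the slope 0.1f and the input values are arbitrary choices:

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.nn.Activation;

public class LeakyReluExample {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray x = manager.create(new float[] {-4f, -1f, 0f, 2f});

            // alpha = 0.1: positive inputs pass through, negative inputs are scaled by 0.1.
            NDArray y = Activation.leakyRelu(x, 0.1f);
            System.out.println(y); // values: -0.4, -0.1, 0, 2
        }
    }
}
```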
public static NDArray elu(NDArray array, float alpha)
Applies ELU (Exponential Linear Unit) activation on the input NDArray.
ELU is defined by: \( y = x \gt 0 ? x : alpha * (e^x - 1) \)
public static NDList elu(NDList arrays, float alpha)
Applies ELU (Exponential Linear Unit) activation on the input singleton NDList.
ELU is defined by: \( y = x \gt 0 ? x : alpha * (e^x - 1) \)
public static NDArray selu(NDArray array)
Applies Scaled ELU activation on the input NDArray.
Scaled ELU is defined by: \( y = lambda * (x \gt 0 ? x : alpha * (e^x - 1))\) where \(lambda = 1.0507009873554804934193349852946\) and \(alpha = 1.6732632423543772848170429916717\)
public static NDList selu(NDList arrays)
Applies Scaled ELU activation on the input singleton NDList.
Scaled ELU is defined by: \( y = lambda * (x \gt 0 ? x : alpha * (e^x - 1))\) where \(lambda = 1.0507009873554804934193349852946\) and \(alpha = 1.6732632423543772848170429916717\)
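Since SELU is ELU with the fixed alpha above, scaled by lambda, the sketch below cross-checks Activation.selu against Activation.elu using those constants (assuming the NDManager API; the input values are arbitrary):

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.nn.Activation;

public class SeluExample {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray x = manager.create(new float[] {-2f, -0.5f, 0f, 1f});

            NDArray viaSelu = Activation.selu(x);

            // SELU is ELU with the fixed alpha above, scaled by lambda.
            float alpha = 1.6732632423543772848170429916717f;
            float lambda = 1.0507009873554804934193349852946f;
            NDArray manual = Activation.elu(x, alpha).mul(lambda);

            System.out.println(viaSelu);
            System.out.println(manual); // should agree up to floating-point error
        }
    }
}
```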
public static NDArray gelu(NDArray array)
Applies GELU (Gaussian Error Linear Unit) activation on the input NDArray.
public static NDList gelu(NDList arrays)
Applies GELU (Gaussian Error Linear Unit) activation on the input singleton NDList.
public static NDArray swish(NDArray array, float beta)
Applies Swish activation on the input NDArray.
Swish is defined as \(y = x * sigmoid(beta * x)\)
public static NDList swish(NDList arrays, float beta)
Applies Swish activation on the input singleton NDList.
Swish is defined as \(y = x * sigmoid(beta * x)\)
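The definition can be reproduced from the other helpers in this class; the sketch below compares Activation.swish with x * sigmoid(beta * x) computed manually (assuming the NDManager API; the inputs and beta are arbitrary):

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.nn.Activation;

public class SwishExample {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray x = manager.create(new float[] {-2f, -0.5f, 0f, 1f, 3f});
            float beta = 1.5f;

            NDArray viaActivation = Activation.swish(x, beta);

            // Recompute the definition directly: y = x * sigmoid(beta * x).
            NDArray manual = x.mul(Activation.sigmoid(x.mul(beta)));

            System.out.println(viaActivation);
            System.out.println(manual); // should match up to floating-point error
        }
    }
}
```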
public static NDArray mish(NDArray array)
Applies Mish activation on the input NDArray.
Mish is defined as \(y = x * tanh(ln(1 + e^x))\), introduced by Diganta Misra in the paper Mish: A Self Regularized Non-Monotonic Neural Activation Function.
public static NDList mish(NDList arrays)
Applies Mish activation on the input singleton NDList.
Mish is defined as \(y = x * tanh(ln(1 + e^x))\), introduced by Diganta Misra in the paper Mish: A Self Regularized Non-Monotonic Neural Activation Function.
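Likewise, Mish can be re-expressed through softPlus and tanh; the sketch below compares Activation.mish with that composition (assuming the NDManager API; the input values are arbitrary):

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.nn.Activation;

public class MishExample {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray x = manager.create(new float[] {-3f, -1f, 0f, 1f, 3f});

            NDArray viaActivation = Activation.mish(x);

            // The definition composed from the other helpers in this class:
            // y = x * tanh(softPlus(x)) with softPlus(x) = ln(1 + e^x).
            NDArray manual = x.mul(Activation.tanh(Activation.softPlus(x)));

            System.out.println(viaActivation);
            System.out.println(manual); // should agree up to floating-point error
        }
    }
}
```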
public static Block reluBlock()
Creates a LambdaBlock that applies the ReLU activation function in its forward function.
Returns:
the LambdaBlock that applies the ReLU activation function
public static Block sigmoidBlock()
Creates a LambdaBlock that applies the Sigmoid activation function in its forward function.
Returns:
the LambdaBlock that applies the Sigmoid activation function
public static Block tanhBlock()
Creates a LambdaBlock that applies the Tanh activation function in its forward function.
Returns:
the LambdaBlock that applies the Tanh activation function
public static Block softPlusBlock()
Creates a LambdaBlock that applies the SoftPlus activation function in its forward function.
Returns:
the LambdaBlock that applies the SoftPlus activation function
public static Block leakyReluBlock(float alpha)
Creates a LambdaBlock that applies the Leaky ReLU activation function in its forward function.
Parameters:
alpha - the slope for the activation
Returns:
the LambdaBlock that applies the Leaky ReLU activation function
public static Block eluBlock(float alpha)
Creates a LambdaBlock that applies the ELU activation function in its forward function.
Parameters:
alpha - the slope for the activation
Returns:
the LambdaBlock that applies the ELU activation function
public static Block seluBlock()
Creates a LambdaBlock that applies the SELU activation function in its forward function.
Returns:
the LambdaBlock that applies the SELU activation function
public static Block geluBlock()
Creates a LambdaBlock that applies the GELU activation function in its forward function.
Returns:
the LambdaBlock that applies the GELU activation function
public static Block swishBlock(float beta)
Creates a LambdaBlock that applies the Swish activation function in its forward function.
Parameters:
beta - a hyper-parameter
Returns:
the LambdaBlock that applies the Swish activation function
public static Block mishBlock()
Creates a LambdaBlock that applies the Mish activation function in its forward function.
Returns:
the LambdaBlock that applies the Mish activation function
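The block factories return ordinary Block instances, so they can be used standalone or dropped into a larger model. Below is a minimal sketch assuming the DJL ParameterStore, DataType, and Shape APIs; the alpha value, input, and class name are arbitrary for illustration:

```java
import ai.djl.ndarray.NDList;
import ai.djl.ndarray.NDManager;
import ai.djl.ndarray.types.DataType;
import ai.djl.ndarray.types.Shape;
import ai.djl.nn.Activation;
import ai.djl.nn.Block;
import ai.djl.training.ParameterStore;

public class ActivationBlockExample {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            // The block factories wrap the static functions so they can be
            // inserted into a model like any other Block.
            Block leaky = Activation.leakyReluBlock(0.1f);
            leaky.initialize(manager, DataType.FLOAT32, new Shape(1, 3));

            NDList input = new NDList(manager.create(new float[] {-2f, 0f, 2f}).reshape(1, 3));
            NDList output = leaky.forward(new ParameterStore(manager, false), input, false);
            System.out.println(output.singletonOrThrow()); // values: -0.2, 0, 2
        }
    }
}
```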