Class Activation
- java.lang.Object
  - ai.djl.nn.Activation
public final class Activation extends java.lang.Object
Utility class that provides activation functions and blocks.

Many networks make use of the Linear block and other similar linear transformations. However, composing any number of linear transformations only yields another linear transformation (\( f(x) = W_2(W_1x) = (W_2W_1)x = W_{combined}x \)). In order to represent non-linear data, non-linear functions called activation functions are interspersed between the linear transformations. This allows the network to represent non-linear functions of increasing complexity.

See Wikipedia for more details.
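For example, the block-returning helpers below can be interleaved with Linear blocks when assembling a network. The following is a minimal sketch (the layer sizes are arbitrary, and Linear.builder().setUnits(...) is assumed to match the Linear builder in the DJL version in use):

    import ai.djl.nn.Activation;
    import ai.djl.nn.SequentialBlock;
    import ai.djl.nn.core.Linear;

    // A small MLP: linear transformations separated by non-linear activations.
    SequentialBlock mlp = new SequentialBlock()
            .add(Linear.builder().setUnits(128).build())
            .add(Activation.reluBlock())   // non-linearity between the linear layers
            .add(Linear.builder().setUnits(64).build())
            .add(Activation.reluBlock())
            .add(Linear.builder().setUnits(10).build());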
Method Summary

- static NDArray elu(NDArray array, float alpha)
  Applies ELU (Exponential Linear Unit) activation on the input NDArray.
- static NDList elu(NDList arrays, float alpha)
  Applies ELU (Exponential Linear Unit) activation on the input singleton NDList.
- static Block eluBlock(float alpha)
  Creates a LambdaBlock that applies the ELU activation function in its forward function.
- static NDArray gelu(NDArray array)
  Applies GELU (Gaussian Error Linear Unit) activation on the input NDArray.
- static NDList gelu(NDList arrays)
  Applies GELU (Gaussian Error Linear Unit) activation on the input singleton NDList.
- static Block geluBlock()
  Creates a LambdaBlock that applies the GELU activation function in its forward function.
- static NDArray leakyRelu(NDArray array, float alpha)
  Applies Leaky ReLU activation on the input NDArray.
- static NDList leakyRelu(NDList arrays, float alpha)
  Applies Leaky ReLU activation on the input singleton NDList.
- static Block leakyReluBlock(float alpha)
  Creates a LambdaBlock that applies the LeakyReLU activation function in its forward function.
- static NDArray mish(NDArray array)
  Applies Mish activation on the input NDArray.
- static NDList mish(NDList arrays)
  Applies Mish activation on the input singleton NDList.
- static Block mishBlock()
  Creates a LambdaBlock that applies the Mish activation function in its forward function.
- static Block preluBlock()
  Returns a Prelu block.
- static NDArray relu(NDArray array)
  Applies ReLU activation on the input NDArray.
- static NDList relu(NDList arrays)
  Applies ReLU activation on the input singleton NDList.
- static NDList relu6(NDList arrays)
  Applies ReLU6 activation on the input singleton NDList.
- static Block relu6Block()
  Creates a LambdaBlock that applies the ReLU6 activation function in its forward function.
- static Block reluBlock()
  Creates a LambdaBlock that applies the ReLU activation function in its forward function.
- static NDArray selu(NDArray array)
  Applies Scaled ELU activation on the input NDArray.
- static NDList selu(NDList arrays)
  Applies Scaled ELU activation on the input singleton NDList.
- static Block seluBlock()
  Creates a LambdaBlock that applies the SELU activation function in its forward function.
- static NDArray sigmoid(NDArray array)
  Applies Sigmoid activation on the input NDArray.
- static NDList sigmoid(NDList arrays)
  Applies Sigmoid activation on the input singleton NDList.
- static Block sigmoidBlock()
  Creates a LambdaBlock that applies the Sigmoid activation function in its forward function.
- static NDArray softPlus(NDArray array)
  Applies softPlus activation on the input NDArray.
- static NDList softPlus(NDList arrays)
  Applies softPlus activation on the input singleton NDList.
- static Block softPlusBlock()
  Creates a LambdaBlock that applies the softPlus(NDList) activation function in its forward function.
- static NDArray softSign(NDArray array)
  Applies softSign activation on the input NDArray.
- static NDList softSign(NDList arrays)
  Applies softSign activation on the input singleton NDList.
- static Block softSignBlock()
  Creates a LambdaBlock that applies the softSign(NDList) activation function in its forward function.
- static NDArray swish(NDArray array, float beta)
  Applies Swish activation on the input NDArray.
- static NDList swish(NDList arrays, float beta)
  Applies Swish activation on the input singleton NDList.
- static Block swishBlock(float beta)
  Creates a LambdaBlock that applies the Swish activation function in its forward function.
- static NDArray tanh(NDArray array)
  Applies Tanh activation on the input NDArray.
- static NDList tanh(NDList arrays)
  Applies Tanh activation on the input singleton NDList.
- static Block tanhBlock()
  Creates a LambdaBlock that applies the Tanh activation function in its forward function.
Method Detail
-
relu
public static NDArray relu(NDArray array)
Applies ReLU activation on the input NDArray.
ReLU is defined by: \( y = max(0, x) \)
-
relu
public static NDList relu(NDList arrays)
Applies ReLU activation on the input singleton NDList.
ReLU is defined by: \( y = max(0, x) \)
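A minimal usage sketch for both overloads, assuming an NDManager named manager (e.g. from NDManager.newBaseManager()); the commented values follow from the definition above:

    NDArray x = manager.create(new float[] {-2f, -0.5f, 0f, 3f});
    NDArray y = Activation.relu(x);                 // [0, 0, 0, 3]
    NDList out = Activation.relu(new NDList(x));    // singleton NDList
    NDArray same = out.singletonOrThrow();          // same values as y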
-
relu6
public static NDList relu6(NDList arrays)
Applies ReLU6 activation on the input singleton NDList.
ReLU6 is defined by: \( y = min(6, max(0, x)) \)
-
sigmoid
public static NDArray sigmoid(NDArray array)
Applies Sigmoid activation on the input NDArray.
Sigmoid is defined by: \( y = 1 / (1 + e^{-x}) \)
-
sigmoid
public static NDList sigmoid(NDList arrays)
Applies Sigmoid activation on the input singleton NDList.
Sigmoid is defined by: \( y = 1 / (1 + e^{-x}) \)
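As a quick numeric check of the definition, sigmoid(0) = 1 / (1 + e^0) = 0.5. A sketch using the same assumed manager as above:

    NDArray x = manager.create(new float[] {0f, 2f});
    float[] y = Activation.sigmoid(x).toFloatArray();   // ~[0.5, 0.8808]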
-
tanh
public static NDArray tanh(NDArray array)
Applies Tanh activation on the input NDArray.
Tanh is defined by: \( y = (e^x - e^{-x}) / (e^x + e^{-x}) \)
-
tanh
public static NDList tanh(NDList arrays)
Applies Tanh activation on the input singleton NDList.
Tanh is defined by: \( y = (e^x - e^{-x}) / (e^x + e^{-x}) \)
-
softPlus
public static NDArray softPlus(NDArray array)
Applies softPlus activation on the input NDArray.
softPlus is defined by: \( y = log(1 + e^x) \)
-
softPlus
public static NDList softPlus(NDList arrays)
Applies softPlus activation on the input singleton NDList.
softPlus is defined by: \( y = log(1 + e^x) \)
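softPlus is a smooth approximation of ReLU: \( log(1 + e^x) \) approaches \( max(0, x) \) as \( |x| \) grows, without the kink at zero. A sketch (same assumed manager):

    NDArray x = manager.create(new float[] {-10f, 0f, 10f});
    NDArray sp = Activation.softPlus(x);   // ~[0.0000454, 0.6931, 10.0000454]
    NDArray r = Activation.relu(x);        // [0, 0, 10]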
-
softSign
public static NDArray softSign(NDArray array)
Applies softSign activation on the input NDArray.
softSign is defined by: \( y = x / (1 + |x|) \)
-
softSign
public static NDList softSign(NDList arrays)
Applies softSign activation on the input singleton NDList.
softSign is defined by: \( y = x / (1 + |x|) \)
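Like Tanh, softSign maps inputs into (-1, 1), but it saturates more slowly. For example, softSign(3) = 3 / (1 + 3) = 0.75:

    NDArray x = manager.create(new float[] {-3f, 0f, 3f});
    NDArray y = Activation.softSign(x);    // [-0.75, 0, 0.75]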
-
leakyRelu
public static NDArray leakyRelu(NDArray array, float alpha)
Applies Leaky ReLU activation on the input NDArray.
Leaky ReLU is defined by: \( y = x \gt 0 ? x : alpha * x \)
-
leakyRelu
public static NDList leakyRelu(NDList arrays, float alpha)
Applies Leaky ReLU activation on the input singleton NDList.
Leaky ReLU is defined by: \( y = x \gt 0 ? x : alpha * x \)
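Here alpha is the slope applied to negative inputs, so with alpha = 0.1 an input of -4 becomes -0.4 (same assumed manager):

    NDArray x = manager.create(new float[] {-4f, 2f});
    NDArray y = Activation.leakyRelu(x, 0.1f);   // [-0.4, 2]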
-
elu
public static NDArray elu(NDArray array, float alpha)
Applies ELU activation on the input NDArray.
ELU is defined by: \( y = x \gt 0 ? x : alpha * (e^x - 1) \)
-
elu
public static NDList elu(NDList arrays, float alpha)
Applies ELU (Exponential Linear Unit) activation on the input singleton NDList.
ELU is defined by: \( y = x \gt 0 ? x : alpha * (e^x - 1) \)
-
selu
public static NDArray selu(NDArray array)
Applies Scaled ELU activation on the input NDArray.
Scaled ELU is defined by: \( y = lambda * (x \gt 0 ? x : alpha * (e^x - 1)) \) where \( lambda = 1.0507009873554804934193349852946 \) and \( alpha = 1.6732632423543772848170429916717 \)
-
selu
public static NDList selu(NDList arrays)
Applies Scaled ELU activation on the input singleton NDList.
Scaled ELU is defined by: \( y = lambda * (x \gt 0 ? x : alpha * (e^x - 1)) \) where \( lambda = 1.0507009873554804934193349852946 \) and \( alpha = 1.6732632423543772848170429916717 \)
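Note that selu takes no arguments; lambda and alpha are the fixed constants above. A sketch of the resulting values (same assumed manager):

    NDArray x = manager.create(new float[] {1f, -1f});
    NDArray y = Activation.selu(x);
    // ~[1.0507, -1.1113]: lambda * 1 and lambda * alpha * (e^{-1} - 1)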
-
gelu
public static NDArray gelu(NDArray array)
Applies GELU (Gaussian Error Linear Unit) activation on the input NDArray.
-
gelu
public static NDList gelu(NDList arrays)
Applies GELU (Gaussian Error Linear Unit) activation on the input singleton NDList.
-
swish
public static NDArray swish(NDArray array, float beta)
Applies Swish activation on the input NDArray.
Swish is defined as \( y = x * sigmoid(beta * x) \)
-
swish
public static NDList swish(NDList arrays, float beta)
Applies Swish activation on the input singleton NDList.
Swish is defined as \( y = x * sigmoid(beta * x) \)
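With beta = 1 this reduces to \( y = x * sigmoid(x) \), often called SiLU. A sketch (same assumed manager):

    NDArray x = manager.create(new float[] {-1f, 0f, 1f});
    NDArray y = Activation.swish(x, 1f);   // ~[-0.2689, 0, 0.7311]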
-
mish
public static NDArray mish(NDArray array)
Applies Mish activation on the input NDArray.
Mish is defined as \( y = x * tanh(ln(1 + e^x)) \), as proposed by Diganta Misra in his paper "Mish: A Self Regularized Non-Monotonic Neural Activation Function"
-
mish
public static NDList mish(NDList arrays)
Applies Mish activation on the input singleton NDList.
Mish is defined as \( y = x * tanh(ln(1 + e^x)) \), as proposed by Diganta Misra in his paper "Mish: A Self Regularized Non-Monotonic Neural Activation Function"
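The documented formula can be cross-checked against the other helpers in this class, since \( tanh(ln(1 + e^x)) = tanh(softPlus(x)) \). A sketch (same assumed manager; the two results should agree up to floating-point error):

    NDArray x = manager.create(new float[] {-2f, 0f, 2f});
    NDArray mish = Activation.mish(x);
    NDArray manual = x.mul(Activation.softPlus(x).tanh());   // x * tanh(ln(1 + e^x))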
-
reluBlock
public static Block reluBlock()
Creates a LambdaBlock that applies the ReLU activation function in its forward function.
- Returns: the LambdaBlock that applies the ReLU activation function
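The returned LambdaBlock holds no parameters, so it can be invoked directly. A minimal sketch, assuming an NDManager named manager and the ParameterStore from ai.djl.training:

    Block relu = Activation.reluBlock();
    NDList out = relu.forward(
            new ParameterStore(manager, false),
            new NDList(manager.create(new float[] {-1f, 1f})),
            false);                          // training = false
    // out.singletonOrThrow() is [0, 1]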
-
relu6Block
public static Block relu6Block()
Creates a LambdaBlock that applies the ReLU6 activation function in its forward function.
- Returns: the LambdaBlock that applies the ReLU6 activation function
-
sigmoidBlock
public static Block sigmoidBlock()
Creates a LambdaBlock that applies the Sigmoid activation function in its forward function.
- Returns: the LambdaBlock that applies the Sigmoid activation function
-
tanhBlock
public static Block tanhBlock()
Creates a LambdaBlock that applies the Tanh activation function in its forward function.
- Returns: the LambdaBlock that applies the Tanh activation function
-
softPlusBlock
public static Block softPlusBlock()
Creates a LambdaBlock that applies the softPlus(NDList) activation function in its forward function.
- Returns: the LambdaBlock that applies the softPlus(NDList) activation function
-
softSignBlock
public static Block softSignBlock()
Creates a LambdaBlock that applies the softSign(NDList) activation function in its forward function.
- Returns: the LambdaBlock that applies the softSign(NDList) activation function
-
leakyReluBlock
public static Block leakyReluBlock(float alpha)
Creates a LambdaBlock that applies the LeakyReLU activation function in its forward function.
- Parameters: alpha - the slope for the activation
- Returns: the LambdaBlock that applies the LeakyReLU activation function
-
eluBlock
public static Block eluBlock(float alpha)
Creates a LambdaBlock that applies the ELU activation function in its forward function.
- Parameters: alpha - the slope for the activation
- Returns: the LambdaBlock that applies the ELU activation function
-
seluBlock
public static Block seluBlock()
Creates a LambdaBlock that applies the SELU activation function in its forward function.
- Returns: the LambdaBlock that applies the SELU activation function
-
geluBlock
public static Block geluBlock()
Creates a LambdaBlock that applies the GELU activation function in its forward function.
- Returns: the LambdaBlock that applies the GELU activation function
-
swishBlock
public static Block swishBlock(float beta)
Creates a LambdaBlock that applies the Swish activation function in its forward function.
- Parameters: beta - a hyper-parameter
- Returns: the LambdaBlock that applies the Swish activation function
-
mishBlock
public static Block mishBlock()
Creates a LambdaBlock that applies the Mish activation function in its forward function.
- Returns: the LambdaBlock that applies the Mish activation function