Class Activation
Many networks make use of the Linear block and other similar linear transformations. However, composing any number of linear transformations only produces another linear transformation (\(f(x) = W_2(W_1x) = (W_2W_1)x = W_{combined}x\)). In order to represent non-linear data, non-linear functions called activation functions are interspersed between the linear transformations. This allows the network to represent non-linear functions of increasing complexity.
See Wikipedia for more details.
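As an illustration of this pattern, the block factories below can be dropped between Linear blocks when assembling a network. The following is a minimal sketch, assuming the standard DJL classes SequentialBlock and Linear are available; the layer sizes are arbitrary.

```java
import ai.djl.nn.Activation;
import ai.djl.nn.Block;
import ai.djl.nn.SequentialBlock;
import ai.djl.nn.core.Linear;

public class MlpSketch {
    public static void main(String[] args) {
        // Two Linear blocks alone would collapse into a single linear map;
        // the activation block between them makes the composition non-linear.
        Block mlp = new SequentialBlock()
                .add(Linear.builder().setUnits(64).build())
                .add(Activation.reluBlock())   // non-linearity between the linear layers
                .add(Linear.builder().setUnits(10).build());
        System.out.println(mlp);
    }
}
```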
-
Method Summary
Modifier and Type | Method | Description
static NDArray | elu | Applies ELU activation on the input NDArray.
static NDList | elu | Applies ELU (Exponential Linear Unit) activation on the input singleton NDList.
static Block | eluBlock(float alpha) | Creates a LambdaBlock that applies the ELU activation function in its forward function.
static NDArray | gelu | Applies GELU (Gaussian Error Linear Unit) activation on the input NDArray.
static NDList | gelu | Applies GELU (Gaussian Error Linear Unit) activation on the input singleton NDList.
static Block | geluBlock | Creates a LambdaBlock that applies the GELU activation function in its forward function.
static NDArray | leakyRelu | Applies Leaky ReLU activation on the input NDArray.
static NDList | leakyRelu | Applies Leaky ReLU activation on the input singleton NDList.
static Block | leakyReluBlock(float alpha) | Creates a LambdaBlock that applies the LeakyReLU activation function in its forward function.
static NDArray | mish | Applies Mish activation on the input NDArray.
static NDList | mish | Applies Mish activation on the input singleton NDList.
static Block | mishBlock | Creates a LambdaBlock that applies the Mish activation function in its forward function.
static Block | preluBlock | Returns a Prelu block.
static NDArray | relu | Applies ReLU activation on the input NDArray.
static NDList | relu | Applies ReLU activation on the input singleton NDList.
static NDArray | relu6 | Applies ReLU6 activation on the input NDArray.
static NDList | relu6 | Applies ReLU6 activation on the input singleton NDList.
static Block | relu6Block | Creates a LambdaBlock that applies the ReLU6 activation function in its forward function.
static Block | reluBlock | Creates a LambdaBlock that applies the ReLU activation function in its forward function.
static NDArray | selu | Applies Scaled ELU activation on the input NDArray.
static NDList | selu | Applies Scaled ELU activation on the input singleton NDList.
static Block | seluBlock | Creates a LambdaBlock that applies the SELU activation function in its forward function.
static NDArray | sigmoid | Applies Sigmoid activation on the input NDArray.
static NDList | sigmoid | Applies Sigmoid activation on the input singleton NDList.
static Block | sigmoidBlock | Creates a LambdaBlock that applies the Sigmoid activation function in its forward function.
static NDArray | softPlus | Applies softPlus activation on the input NDArray.
static NDList | softPlus | Applies softPlus activation on the input singleton NDList.
static Block | softPlusBlock | Creates a LambdaBlock that applies the softPlus(NDList) activation function in its forward function.
static NDArray | softSign | Applies softSign activation on the input NDArray.
static NDList | softSign | Applies softSign activation on the input singleton NDList.
static Block | softSignBlock | Creates a LambdaBlock that applies the softSign(NDList) activation function in its forward function.
static NDArray | swish | Applies Swish activation on the input NDArray.
static NDList | swish | Applies Swish activation on the input singleton NDList.
static Block | swishBlock(float beta) | Creates a LambdaBlock that applies the Swish activation function in its forward function.
static NDArray | tanh | Applies Tanh activation on the input NDArray.
static NDList | tanh | Applies Tanh activation on the input singleton NDList.
static Block | tanhBlock | Creates a LambdaBlock that applies the Tanh activation function in its forward function.
-
Method Details
-
relu
Applies ReLU activation on the input NDArray.
ReLU is defined by: \( y = max(0, x) \)
-
relu
Applies ReLU activation on the input singleton NDList.
ReLU is defined by: \( y = max(0, x) \)
-
relu6
Applies ReLU6 activation on the input NDArray.
ReLU6 is defined by: \( y = min(6, max(0, x)) \)
-
relu6
Applies ReLU6 activation on the input singleton NDList.
ReLU6 is defined by: \( y = min(6, max(0, x)) \)
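As a rough usage sketch (not part of this API reference), the snippet below applies relu and relu6 to a small NDArray; it assumes a DJL engine is available so that NDManager.newBaseManager() can create arrays.

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.nn.Activation;

public class ReluSketch {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray x = manager.create(new float[] {-3f, -0.5f, 0f, 2f, 8f});
            // ReLU clamps negatives to 0: [0, 0, 0, 2, 8]
            System.out.println(Activation.relu(x));
            // ReLU6 additionally caps values at 6: [0, 0, 0, 2, 6]
            System.out.println(Activation.relu6(x));
        }
    }
}
```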
-
sigmoid
Applies Sigmoid activation on the input NDArray.
Sigmoid is defined by: \( y = 1 / (1 + e^{-x}) \)
-
sigmoid
Applies Sigmoid activation on the input singleton NDList.
Sigmoid is defined by: \( y = 1 / (1 + e^{-x}) \)
-
tanh
Applies Tanh activation on the input NDArray.
Tanh is defined by: \( y = (e^x - e^{-x}) / (e^x + e^{-x}) \)
-
tanh
Applies Tanh activation on the input singleton NDList.
Tanh is defined by: \( y = (e^x - e^{-x}) / (e^x + e^{-x}) \)
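A minimal sketch of the two squashing functions above, assuming an NDManager backed by a default engine: Sigmoid maps into (0, 1) and Tanh maps into (-1, 1).

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.nn.Activation;

public class SigmoidTanhSketch {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray x = manager.create(new float[] {-2f, 0f, 2f});
            // Sigmoid squashes into (0, 1); sigmoid(0) = 0.5
            System.out.println(Activation.sigmoid(x));
            // Tanh squashes into (-1, 1); tanh(0) = 0
            System.out.println(Activation.tanh(x));
        }
    }
}
```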
-
softPlus
Applies softPlus activation on the input NDArray.
softPlus is defined by: \( y = log(1 + e^x) \)
-
softPlus
Applies softPlus activation on the input singleton NDList.
softPlus is defined by: \( y = log(1 + e^x) \)
-
softSign
Applies softSign activation on the input NDArray.
softSign is defined by: \( y = x / (1 + |x|) \)
-
softSign
Applies softSign activation on the input singleton NDList.
softSign is defined by: \( y = x / (1 + |x|) \)
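A brief sketch of the two functions above, assuming a default engine for NDManager: softPlus is a smooth approximation of ReLU, and softSign squashes into (-1, 1) with slower saturation than Tanh.

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.nn.Activation;

public class SoftSketch {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray x = manager.create(new float[] {-4f, 0f, 4f});
            // softPlus: smooth approximation of ReLU; softPlus(0) = log(2) ~= 0.693
            System.out.println(Activation.softPlus(x));
            // softSign: bounded in (-1, 1), e.g. 4 -> 4 / (1 + 4) = 0.8
            System.out.println(Activation.softSign(x));
        }
    }
}
```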
-
leakyRelu
Applies Leaky ReLU activation on the input NDArray.
Leaky ReLU is defined by: \( y = x \gt 0 ? x : alpha * x \)
-
leakyRelu
Applies Leaky ReLU activation on the input singleton NDList.
Leaky ReLU is defined by: \( y = x \gt 0 ? x : alpha * x \)
-
elu
Applies ELU activation on the input NDArray.
ELU is defined by: \( y = x \gt 0 ? x : alpha * (e^x - 1) \)
-
elu
Applies ELU (Exponential Linear Unit) activation on the input singleton NDList.
ELU is defined by: \( y = x \gt 0 ? x : alpha * (e^x - 1) \)
-
selu
Applies Scaled ELU activation on the input NDArray.
Scaled ELU is defined by: \( y = lambda * (x \gt 0 ? x : alpha * (e^x - 1)) \) where \( lambda = 1.0507009873554804934193349852946 \) and \( alpha = 1.6732632423543772848170429916717 \)
-
selu
Applies Scaled ELU activation on the input singleton NDList.
Scaled ELU is defined by: \( y = lambda * (x \gt 0 ? x : alpha * (e^x - 1)) \) where \( lambda = 1.0507009873554804934193349852946 \) and \( alpha = 1.6732632423543772848170429916717 \)
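The sketch below exercises the Leaky ReLU, ELU, and Scaled ELU overloads shown above. It assumes the NDArray overloads of leakyRelu and elu accept the alpha hyperparameter from the formulas as a float argument; treat the exact signatures as an assumption and check the parameter lists in this reference.

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.nn.Activation;

public class EluFamilySketch {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray x = manager.create(new float[] {-2f, 0f, 2f});
            float alpha = 0.1f;
            // Leaky ReLU keeps a small negative slope: -2 -> alpha * -2 = -0.2
            System.out.println(Activation.leakyRelu(x, alpha));
            // ELU saturates smoothly for negative inputs: -2 -> alpha * (e^-2 - 1)
            System.out.println(Activation.elu(x, alpha));
            // Scaled ELU uses the fixed lambda and alpha constants listed above
            System.out.println(Activation.selu(x));
        }
    }
}
```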
-
gelu
Applies GELU (Gaussian Error Linear Unit) activation on the input NDArray.
-
gelu
Applies GELU (Gaussian Error Linear Unit) activation on the input singleton NDList.
-
swish
Applies Swish activation on the input NDArray.
Swish is defined as \( y = x * sigmoid(beta * x) \)
-
swish
Applies Swish activation on the input singleton NDList.
Swish is defined as \( y = x * sigmoid(beta * x) \)
-
mish
Applies Mish activation on the input NDArray.
Mish is defined as \( y = x * tanh(ln(1 + e^x)) \), as proposed by Diganta Misra in his paper Mish: A Self Regularized Non-Monotonic Neural Activation Function.
-
mish
Applies Mish activation on the input singleton NDList.
Mish is defined as \( y = x * tanh(ln(1 + e^x)) \), as proposed by Diganta Misra in his paper Mish: A Self Regularized Non-Monotonic Neural Activation Function.
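A short sketch of the three smooth, gated activations above (GELU, Swish, Mish). It assumes the Swish overload takes the beta hyperparameter from the formula as a float; with beta = 1, Swish reduces to x * sigmoid(x).

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.nn.Activation;

public class SmoothGateSketch {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray x = manager.create(new float[] {-2f, 0f, 2f});
            // GELU: input weighted by the Gaussian CDF of the input
            System.out.println(Activation.gelu(x));
            // Swish with beta = 1: x * sigmoid(x)
            System.out.println(Activation.swish(x, 1f));
            // Mish: x * tanh(ln(1 + e^x))
            System.out.println(Activation.mish(x));
        }
    }
}
```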
-
reluBlock
Creates a LambdaBlock that applies the ReLU activation function in its forward function.
- Returns:
- the LambdaBlock that applies the ReLU activation function
-
relu6Block
Creates a LambdaBlock that applies the ReLU6 activation function in its forward function.
- Returns:
- the LambdaBlock that applies the ReLU6 activation function
-
sigmoidBlock
Creates a LambdaBlock that applies the Sigmoid activation function in its forward function.
- Returns:
- the LambdaBlock that applies the Sigmoid activation function
-
tanhBlock
Creates a LambdaBlock that applies the Tanh activation function in its forward function.
- Returns:
- the LambdaBlock that applies the Tanh activation function
-
softPlusBlock
Creates a LambdaBlock that applies the softPlus(NDList) activation function in its forward function.
- Returns:
- the LambdaBlock that applies the softPlus(NDList) activation function
-
softSignBlock
Creates a LambdaBlock that applies the softSign(NDList) activation function in its forward function.
- Returns:
- the LambdaBlock that applies the softSign(NDList) activation function
-
leakyReluBlock
Creates a LambdaBlock that applies the LeakyReLU activation function in its forward function.
- Parameters:
- alpha - the slope for the activation
- Returns:
- the LambdaBlock that applies the LeakyReLU activation function
-
eluBlock
Creates a LambdaBlock that applies the ELU activation function in its forward function.
- Parameters:
- alpha - the slope for the activation
- Returns:
- the LambdaBlock that applies the ELU activation function
-
seluBlock
Creates a LambdaBlock that applies the SELU activation function in its forward function.
- Returns:
- the LambdaBlock that applies the SELU activation function
-
geluBlock
Creates a LambdaBlock that applies the GELU activation function in its forward function.
- Returns:
- the LambdaBlock that applies the GELU activation function
-
swishBlock
Creates a LambdaBlock that applies the Swish activation function in its forward function.
- Parameters:
- beta - a hyper-parameter
- Returns:
- the LambdaBlock that applies the Swish activation function
-
mishBlock
Creates a LambdaBlock that applies the Mish activation function in its forward function.
- Returns:
- the LambdaBlock that applies the Mish activation function
-
preluBlock
Returns a Prelu block.
- Returns:
- a Prelu block
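Unlike the stateless LambdaBlocks returned by the other factories, a Prelu block carries a slope for negative inputs that is treated as a trainable parameter of the network (this characterization follows the usual PReLU definition and should be checked against the Prelu class documentation). A minimal sketch of dropping it into a network, with arbitrary layer sizes:

```java
import ai.djl.nn.Activation;
import ai.djl.nn.Block;
import ai.djl.nn.SequentialBlock;
import ai.djl.nn.core.Linear;

public class PreluSketch {
    public static void main(String[] args) {
        // The Prelu block's negative-input slope is learned during training,
        // whereas leakyReluBlock(alpha) fixes that slope up front.
        Block net = new SequentialBlock()
                .add(Linear.builder().setUnits(16).build())
                .add(Activation.preluBlock())
                .add(Linear.builder().setUnits(1).build());
        System.out.println(net);
    }
}
```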
-