public class ActivationRReLU extends BaseActivationFunction
Modifier and Type | Field and Description
---|---
static double | DEFAULT_L
static double | DEFAULT_U
Constructor and Description
---
ActivationRReLU()
ActivationRReLU(double l, double u)
Modifier and Type | Method and Description
---|---
org.nd4j.linalg.primitives.Pair&lt;INDArray,INDArray&gt; | backprop(INDArray in, INDArray epsilon): Backpropagate the errors through the activation function, given input z and epsilon dL/da. Returns two INDArrays: (a) the gradient dL/dz, calculated from dL/da, and (b) the parameter gradients dL/dw, where w is the weights of the activation function.
INDArray | getActivation(INDArray in, boolean training): Apply the activation function to the input array (usually known as 'preOut' or 'z'). Implementations must overwrite "in", transform it in place, and return "in". Behaviour may differ between training and testing.
String | toString()
Methods inherited from class BaseActivationFunction: getGradientViewArray, getParametersViewArray, numParams, setGradientViewArray, setParametersViewArray
public static final double DEFAULT_L
public static final double DEFAULT_U
public ActivationRReLU()
public ActivationRReLU(double l, double u)
public INDArray getActivation(INDArray in, boolean training)
Specified by: getActivation in interface IActivation
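As a conceptual sketch of what RReLU computes (not the ND4J implementation, which transforms an INDArray in place; the class and method names here are hypothetical), the forward pass on a plain double array might look like this. During training, each negative element is scaled by a slope drawn uniformly from [l, u]; at test time the mean slope (l + u) / 2 is used instead:

```java
import java.util.Random;

// Hypothetical sketch of the RReLU forward pass on a plain double array.
// l and u bound the uniform distribution the negative-side slope is drawn from.
public class RReLUSketch {
    public static double[] rrelu(double[] z, double l, double u,
                                 boolean training, Random rng) {
        double[] out = new double[z.length];
        for (int i = 0; i < z.length; i++) {
            if (z[i] >= 0.0) {
                out[i] = z[i];                                   // positive side: identity
            } else if (training) {
                double alpha = l + (u - l) * rng.nextDouble();   // random slope per element
                out[i] = alpha * z[i];
            } else {
                out[i] = 0.5 * (l + u) * z[i];                   // test time: mean slope
            }
        }
        return out;
    }
}
```

Note that, unlike the real getActivation contract described above, this sketch returns a new array rather than overwriting its input.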
public org.nd4j.linalg.primitives.Pair<INDArray,INDArray> backprop(INDArray in, INDArray epsilon)
Specified by: backprop in interface IActivation
Parameters:
in - Input, before applying the activation function (z, or 'preOut')
epsilon - Gradient to be backpropagated: dL/da, where L is the loss function

Copyright © 2017. All rights reserved.
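The chain-rule step that backprop performs can also be sketched on plain arrays (again with hypothetical names, not the ND4J code). Since RReLU's negative-side slopes are sampled rather than learned, there are no trainable parameters: dL/dz is simply epsilon (dL/da) times the slope used in the forward pass, which is 1 for non-negative inputs and the sampled alpha otherwise:

```java
// Hypothetical sketch of the RReLU backward pass.
// alpha[] holds the per-element negative-side slopes used in the forward pass.
public class RReLUBackpropSketch {
    public static double[] backprop(double[] z, double[] epsilon, double[] alpha) {
        double[] dLdz = new double[z.length];
        for (int i = 0; i < z.length; i++) {
            // da/dz is 1 on the positive side, alpha[i] on the negative side
            double dadz = (z[i] >= 0.0) ? 1.0 : alpha[i];
            dLdz[i] = epsilon[i] * dadz;   // chain rule: dL/dz = dL/da * da/dz
        }
        return dLdz;
    }
}
```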