Interface DropoutHelper

All Superinterfaces:
    LayerHelper

public interface DropoutHelper extends LayerHelper

Method Summary
Modifier and Type    Method and Description
void                 applyDropout(INDArray inputActivations, INDArray resultArray, double dropoutInputRetainProb)
                     Apply dropout during the forward pass
void                 backprop(INDArray gradAtOutput, INDArray gradAtInput)
                     Perform backpropagation
boolean              checkSupported()
Methods inherited from interface org.deeplearning4j.nn.layers.LayerHelper
    helperMemoryUse
Method Detail

checkSupported

boolean checkSupported()

Specified by:
    checkSupported in interface LayerHelper
Returns:
    true if this dropout helper is supported in the current environment
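A typical use of checkSupported() is as a runtime guard: attempt the helper path only when it is available, otherwise fall back to the default implementation. The sketch below is a hypothetical illustration of that pattern (the interface and names here are stand-ins, not DL4J types):

```java
// Hypothetical stand-in for a helper with a checkSupported() capability test.
interface SupportedCheck {
    boolean checkSupported();
}

public class HelperGuard {
    // Guard pattern: use the helper only if it exists and reports support
    // in the current environment; otherwise take the fallback path.
    static String run(SupportedCheck helper) {
        if (helper != null && helper.checkSupported()) {
            return "helper path";
        }
        return "fallback path";
    }

    public static void main(String[] args) {
        System.out.println(run(() -> true));   // helper available and supported
        System.out.println(run(null));         // no helper present
    }
}
```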
applyDropout

void applyDropout(INDArray inputActivations, INDArray resultArray, double dropoutInputRetainProb)

Apply dropout during the forward pass.

Parameters:
    inputActivations - Input activations (pre dropout)
    resultArray - Output activations (post dropout). May be the same as (or different to) the input array
    dropoutInputRetainProb - Probability of retaining an activation
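The retain-probability semantics can be sketched with plain Java arrays. This is a hypothetical illustration, not the DL4J implementation (real helpers operate on INDArrays); it assumes inverted dropout, where retained activations are scaled by 1/p so the expected magnitude is unchanged:

```java
import java.util.Random;

public class DropoutSketch {
    // Hypothetical plain-array analogue of applyDropout: each activation is
    // retained with probability p and, if retained, scaled by 1/p (inverted
    // dropout). Dropped activations become zero. The mask is returned so the
    // same mask can be reused during backprop.
    static boolean[] applyDropout(double[] in, double[] out, double p, Random rng) {
        boolean[] mask = new boolean[in.length];
        for (int i = 0; i < in.length; i++) {
            mask[i] = rng.nextDouble() < p;       // retain with probability p
            out[i] = mask[i] ? in[i] / p : 0.0;   // scale retained values by 1/p
        }
        return mask;
    }

    public static void main(String[] args) {
        double[] in = {1.0, 2.0, 3.0, 4.0};
        double[] out = new double[in.length];
        applyDropout(in, out, 0.5, new Random(42));
        // Each output is either 0.0 (dropped) or the input scaled by 1/0.5
        for (double v : out) System.out.println(v);
    }
}
```

Note that, as documented above, the output array may alias the input array; the element-wise loop here is safe under aliasing because each index is read before it is written.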
backprop

void backprop(INDArray gradAtOutput, INDArray gradAtInput)

Perform backpropagation. Note that the same dropout mask should be used for backprop as was used during the last call to applyDropout(INDArray, INDArray, double).

Parameters:
    gradAtOutput - Gradient at output (from the perspective of the forward pass)
    gradAtInput - Result array - gradient at input. May be the same as (or different to) the gradient at output array
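The shared-mask requirement can be illustrated in the same plain-array style. Again a hypothetical sketch, not the real helper: the backward pass multiplies the output gradient by exactly the mask (and the same 1/p scaling) that the last forward pass applied, so dropped units receive zero gradient:

```java
import java.util.Arrays;

public class DropoutBackpropSketch {
    // Hypothetical analogue of backprop: dL/dInput = mask * dL/dOutput / p,
    // using the SAME mask produced by the last forward-pass dropout call.
    static void backprop(double[] gradAtOutput, double[] gradAtInput,
                         boolean[] mask, double p) {
        for (int i = 0; i < gradAtOutput.length; i++) {
            gradAtInput[i] = mask[i] ? gradAtOutput[i] / p : 0.0;
        }
    }

    public static void main(String[] args) {
        boolean[] mask = {true, false, true, false}; // mask saved from forward pass
        double[] gradOut = {0.1, 0.2, 0.3, 0.4};
        double[] gradIn = new double[gradOut.length];
        backprop(gradOut, gradIn, mask, 0.5);
        // Dropped units get zero gradient; retained ones are scaled by 1/p
        System.out.println(Arrays.toString(gradIn)); // prints [0.2, 0.0, 0.6, 0.0]
    }
}
```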