Class Dropout
- java.lang.Object
-
- org.deeplearning4j.nn.conf.dropout.Dropout
-
- All Implemented Interfaces:
Serializable, Cloneable, IDropout
public class Dropout extends Object implements IDropout
- See Also:
- Serialized Form
-
-
Field Summary
Fields
- static String CUDNN_DROPOUT_HELPER_CLASS_NAME
- protected boolean helperAllowFallback
  When using CuDNN and an error is encountered, should fallback to the non-CuDNN implementation be allowed? If set to false, an exception in CuDNN will be propagated back to the user.
-
Method Summary
All Methods | Instance Methods | Concrete Methods
- INDArray applyDropout(INDArray inputActivations, INDArray output, int iteration, int epoch, LayerWorkspaceMgr workspaceMgr)
- INDArray backprop(INDArray gradAtOutput, INDArray gradAtInput, int iteration, int epoch)
  Perform backprop.
- void clear()
  Clear the internal state (for example, dropout mask) if any is present
- Dropout clone()
- Dropout helperAllowFallback(boolean allowFallback)
  When using a helper (CuDNN or MKLDNN in some cases) and an error is encountered, should fallback to the non-helper implementation be allowed? If set to false, an exception in the helper will be propagated back to the user.
- protected void initializeHelper(DataType dataType)
  Initialize the CuDNN dropout helper, if possible
-
-
-
Field Detail
-
helperAllowFallback
protected boolean helperAllowFallback
When using CuDNN and an error is encountered, should fallback to the non-CuDNN implementation be allowed? If set to false, an exception in CuDNN will be propagated back to the user. If false, the built-in (non-CuDNN) implementation for Dropout will be used.
-
CUDNN_DROPOUT_HELPER_CLASS_NAME
public static final String CUDNN_DROPOUT_HELPER_CLASS_NAME
- See Also:
- Constant Field Values
-
-
Constructor Detail
-
Dropout
public Dropout(double activationRetainProbability)
- Parameters:
- activationRetainProbability - Probability of retaining an activation - see Dropout javadoc
-
Dropout
public Dropout(ISchedule activationRetainProbabilitySchedule)
- Parameters:
- activationRetainProbabilitySchedule - Schedule for probability of retaining an activation - see Dropout javadoc
-
Dropout
protected Dropout(double activationRetainProbability, ISchedule activationRetainProbabilitySchedule)
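The constructors above take a *retain* probability p (the probability that an activation is kept), not a drop probability. A minimal plain-Java sketch of what that parameter means under standard "inverted" dropout, which scales kept activations by 1/p so the expected activation is unchanged; this is an illustration of the general technique, not DL4J internals:

```java
// Illustrative sketch (not DL4J code): with retain probability p, each
// activation is kept with probability p (and scaled by 1/p) or zeroed with
// probability 1 - p, so the expected output equals the input activation.
class RetainProbabilityDemo {
    static double expectedValue(double activation, double p) {
        // E[output] = p * (activation / p) + (1 - p) * 0 = activation
        return p * (activation / p) + (1 - p) * 0.0;
    }
}
```

This is why no rescaling is needed at test time: the expectation is already matched during training.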
-
-
Method Detail
-
helperAllowFallback
public Dropout helperAllowFallback(boolean allowFallback)
When using a helper (CuDNN or MKLDNN in some cases) and an error is encountered, should fallback to the non-helper implementation be allowed? If set to false, an exception in the helper will be propagated back to the user. If false, the built-in (non-helper) implementation for Dropout will be used.
- Parameters:
- allowFallback - Whether fallback to the non-helper implementation should be allowed
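The flag controls a try-helper-then-fall-back policy. A plain-Java sketch of that policy (names here are hypothetical, not DL4J's internal structure): run the accelerated helper first; if it throws and fallback is allowed, run the built-in implementation, otherwise propagate the helper's exception to the caller.

```java
// Hypothetical illustration of the fallback policy controlled by
// helperAllowFallback. Impl stands in for "some dropout implementation".
class FallbackPolicyDemo {
    interface Impl { double[] run(double[] x); }

    static double[] runWithPolicy(Impl helper, Impl builtIn, double[] x,
                                  boolean allowFallback) {
        try {
            return helper.run(x);          // try the CuDNN/MKLDNN-style helper
        } catch (RuntimeException e) {
            if (!allowFallback) throw e;   // allowFallback == false: propagate to user
            return builtIn.run(x);         // otherwise use the built-in implementation
        }
    }
}
```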
-
initializeHelper
protected void initializeHelper(DataType dataType)
Initialize the CuDNN dropout helper, if possible
-
applyDropout
public INDArray applyDropout(INDArray inputActivations, INDArray output, int iteration, int epoch, LayerWorkspaceMgr workspaceMgr)
- Specified by:
- applyDropout in interface IDropout
- Parameters:
- inputActivations - Input activations array
- output - The result array (same as inputActivations for in-place ops) for the post-dropout activations
- iteration - Current iteration number
- epoch - Current epoch number
- workspaceMgr - Workspace manager, if any storage is required (use ArrayType.INPUT)
- Returns:
- The output (result array) after applying dropout
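What the forward pass computes can be sketched in plain Java using standard inverted dropout (an assumption about the algorithm; field and method names here are illustrative, not DL4J's internals): sample a mask whose entries are 1/p (keep and scale) or 0 (drop), apply it elementwise, and retain the mask for backprop.

```java
import java.util.Random;

// Minimal inverted-dropout sketch mirroring the applyDropout contract:
// write the post-dropout activations into the output array and return it.
class InvertedDropoutSketch {
    final double p;       // retain probability
    double[] mask;        // last sampled mask, kept for backprop
    final Random rng;

    InvertedDropoutSketch(double retainProbability, long seed) {
        this.p = retainProbability;
        this.rng = new Random(seed);
    }

    double[] applyDropout(double[] input, double[] output) {
        mask = new double[input.length];
        for (int i = 0; i < input.length; i++) {
            // keep with probability p (scaled by 1/p), else drop to zero
            mask[i] = rng.nextDouble() < p ? 1.0 / p : 0.0;
            output[i] = input[i] * mask[i];
        }
        return output;
    }
}
```

Passing the input array itself as `output` gives the in-place behavior the parameter description mentions.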
-
backprop
public INDArray backprop(INDArray gradAtOutput, INDArray gradAtInput, int iteration, int epoch)
Description copied from interface: IDropout
Perform backprop. This should also clear the internal state (dropout mask) if any is present.
- Specified by:
- backprop in interface IDropout
- Parameters:
- gradAtOutput - Gradients at the output of the dropout op - i.e., dL/dOut
- gradAtInput - Gradients at the input of the dropout op - i.e., dL/dIn. Use the same array as gradAtOutput to apply the backprop gradient in-place
- iteration - Current iteration
- epoch - Current epoch
- Returns:
- Same array as gradAtInput - i.e., the gradient after backpropagating through the dropout op (dL/dIn)
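Because dropout is an elementwise multiply by the mask, its backward pass multiplies the incoming gradient by the same mask. A self-contained plain-Java sketch of that contract (an illustration, not DL4J internals), including the "clear internal state afterwards" behavior described above:

```java
// Sketch of the backprop contract: dL/dIn = dL/dOut * mask (elementwise),
// written into gradAtInput and returned; the stored mask is then cleared.
class DropoutBackpropSketch {
    double[] mask;  // mask saved by the forward pass

    double[] backprop(double[] gradAtOutput, double[] gradAtInput) {
        for (int i = 0; i < gradAtOutput.length; i++) {
            gradAtInput[i] = gradAtOutput[i] * mask[i];  // dL/dIn = dL/dOut * mask
        }
        clear();            // backprop also clears the internal state
        return gradAtInput;
    }

    void clear() { mask = null; }
}
```

Passing the same array for both arguments applies the gradient in place, as the gradAtInput parameter description suggests.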
-
clear
public void clear()
Description copied from interface: IDropout
Clear the internal state (for example, dropout mask) if any is present
-
-