Package org.deeplearning4j.gradientcheck
Class GradientCheckUtil

java.lang.Object
  org.deeplearning4j.gradientcheck.GradientCheckUtil

public class GradientCheckUtil extends Object
Nested Class Summary
- static class GradientCheckUtil.GraphConfig
- static class GradientCheckUtil.MLNConfig
- static class GradientCheckUtil.PrintMode
-
Method Summary
- static boolean checkGradients(GradientCheckUtil.GraphConfig c)
- static boolean checkGradients(GradientCheckUtil.MLNConfig c)
- static boolean checkGradients(MultiLayerNetwork mln, double epsilon, double maxRelError, double minAbsoluteError, boolean print, boolean exitOnFirstError, INDArray input, INDArray labels)
  Deprecated.
- static boolean checkGradients(MultiLayerNetwork mln, double epsilon, double maxRelError, double minAbsoluteError, boolean print, boolean exitOnFirstError, INDArray input, INDArray labels, INDArray inputMask, INDArray labelMask, boolean subset, int maxPerParam, Set<String> excludeParams, Integer rngSeedResetEachIter)
  Deprecated.
- static boolean checkGradientsPretrainLayer(Layer layer, double epsilon, double maxRelError, double minAbsoluteError, boolean print, boolean exitOnFirstError, INDArray input, int rngSeed)
  Check backprop gradients for a pretrain layer. NOTE: gradient checking pretrain layers can be difficult...
-
Method Detail

checkGradients

@Deprecated
public static boolean checkGradients(MultiLayerNetwork mln, double epsilon, double maxRelError, double minAbsoluteError, boolean print, boolean exitOnFirstError, INDArray input, INDArray labels)

Deprecated. Check backprop gradients for a MultiLayerNetwork.

Parameters:
- mln - MultiLayerNetwork to test. This must be initialized.
- epsilon - Step size for the numerical gradient calculation. Usually on the order of 1e-4 or so.
- maxRelError - Maximum relative error. Usually < 1e-5 or so, though it may be more for deep networks or those with nonlinear activations.
- minAbsoluteError - Minimum absolute error to cause a failure. Numerical gradients can be non-zero due to precision issues. For example, 0.0 vs. 1e-18: the relative error is 1.0, but this is not really a failure.
- print - Whether to print full pass/failure details for each parameter gradient.
- exitOnFirstError - If true: return upon the first failure. If false: continue checking even if one parameter gradient has failed. Typically use false for debugging and true for unit tests.
- input - Input array to use for the forward pass. May be mini-batch data.
- labels - Labels/targets to use to calculate the backprop gradient. May be mini-batch data.

Returns:
true if all gradients pass, false otherwise.
-
checkGradients

@Deprecated
public static boolean checkGradients(MultiLayerNetwork mln, double epsilon, double maxRelError, double minAbsoluteError, boolean print, boolean exitOnFirstError, INDArray input, INDArray labels, INDArray inputMask, INDArray labelMask, boolean subset, int maxPerParam, Set<String> excludeParams, Integer rngSeedResetEachIter)

Deprecated.
-
checkGradients
public static boolean checkGradients(GradientCheckUtil.MLNConfig c)
-
checkGradients
public static boolean checkGradients(GradientCheckUtil.GraphConfig c)
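A usage sketch for the non-deprecated, config-based overload. This is an illustrative example, not taken from this page: the network architecture, data shapes, and seed below are assumptions, and the fluent setters on MLNConfig (net, input, labels) follow the builder-style pattern used in Deeplearning4j's own gradient check tests. It requires Deeplearning4j and ND4J on the classpath; gradient checks should be run in double precision.

```java
// Illustrative sketch (assumed architecture and shapes); requires DL4J + ND4J on the classpath.
import org.deeplearning4j.gradientcheck.GradientCheckUtil;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class GradientCheckExample {
    public static void main(String[] args) {
        // Gradient checks need double precision; float is too imprecise.
        Nd4j.setDefaultDataTypes(DataType.DOUBLE, DataType.DOUBLE);

        // A small hypothetical network: 4 inputs -> 3 hidden (tanh) -> 2 outputs (softmax).
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .dataType(DataType.DOUBLE)
                .seed(12345)
                .list()
                .layer(new DenseLayer.Builder().nIn(4).nOut(3)
                        .activation(Activation.TANH).build())
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX).nIn(3).nOut(2).build())
                .build();
        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();

        // Small random mini-batch: 5 examples, one-hot labels.
        INDArray input = Nd4j.rand(DataType.DOUBLE, 5, 4);
        INDArray labels = Nd4j.zeros(DataType.DOUBLE, 5, 2);
        for (int i = 0; i < 5; i++) {
            labels.putScalar(i, i % 2, 1.0);
        }

        // Configure and run the check via the MLNConfig overload.
        boolean ok = GradientCheckUtil.checkGradients(
                new GradientCheckUtil.MLNConfig()
                        .net(net)
                        .input(input)
                        .labels(labels));
        System.out.println("Gradient check passed: " + ok);
    }
}
```

The GraphConfig overload follows the same pattern for ComputationGraph models. Epsilon, maxRelError, and minAbsoluteError have defaults in the config classes and can be overridden with the corresponding setters when a network needs looser or tighter tolerances.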
-
checkGradientsPretrainLayer

public static boolean checkGradientsPretrainLayer(Layer layer, double epsilon, double maxRelError, double minAbsoluteError, boolean print, boolean exitOnFirstError, INDArray input, int rngSeed)

Check backprop gradients for a pretrain layer. NOTE: gradient checking pretrain layers can be difficult...