whether to cache the Samples after preprocessing. Default: true
Clear gradient clipping parameters. In this case, gradient clipping will not be applied.
Constant gradient clipping thresholds.
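As a hedged sketch of how the clipping setters above might be used (assuming an already-constructed estimator named `estimator`; the threshold values are illustrative, and setter signatures may vary slightly across BigDL versions):

```scala
// Clip every gradient element into the range [-2.0, 2.0]:
estimator.setConstantGradientClipping(-2.0f, 2.0f)

// Or clip gradients by their global L2 norm instead:
estimator.setGradientClippingByL2Norm(5.0f)

// Remove any clipping configuration; gradients are then used unclipped:
estimator.clearGradientClipping()
```

Constant clipping and L2-norm clipping are alternatives; setting one typically replaces the other.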
Return a deep copy for DLEstimator. Note that trainSummary and validationSummary will not be copied to the new instance since currently they are not thread-safe.
BigDL criterion (loss function)
When to stop the training, passed in a Trigger. E.g. Trigger.maxIteration(1000)
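A minimal sketch of configuring the stop condition (assuming an existing estimator instance named `estimator`; the import path may differ between BigDL releases):

```scala
import com.intel.analytics.bigdl.optim.Trigger

// Stop training after 1000 iterations:
estimator.setEndWhen(Trigger.maxIteration(1000))

// Or stop after 10 full epochs instead:
estimator.setEndWhen(Trigger.maxEpoch(10))
```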
get the validation configuration used during training
an Option of Tuple(validation trigger, validation data, Array[ValidationMethod[T]], batch size)
Statistics (LearningRate, Loss, Throughput, Parameters) collected during training for the validation data if validation data is set, which can be used for visualization via Tensorboard. Use setValidationSummary to enable validation logger. Then the log will be saved to logDir/appName/ as specified by the parameters of validationSummary.
Default: None
L2 norm gradient clipping threshold.
learning rate for the optimizer in the NNEstimator. Default: 0.001
learning rate decay for each iteration. Default: 0
Number of max epochs for the training; an epoch refers to a traverse over the training data. Default: 50
BigDL module to be optimized
optimization method to be used. BigDL supports many optimization methods like Adam, SGD and LBFGS. Refer to package com.intel.analytics.bigdl.optim for all the options. Default: SGD
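A hedged example of swapping the default SGD for Adam (assuming an existing estimator named `estimator`; the learning rate shown is illustrative):

```scala
import com.intel.analytics.bigdl.optim.Adam

// Use Adam instead of the default SGD optimizer:
estimator.setOptimMethod(new Adam[Float](learningRate = 1e-3))
```

When an optimization method is set explicitly, its own hyperparameters generally take precedence over the estimator-level learning rate settings.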
Statistics (LearningRate, Loss, Throughput, Parameters) collected during training for the training data, which can be used for visualization via Tensorboard. Use setTrainSummary to enable train logger. Then the log will be saved to logDir/appName/train as specified by the parameters of TrainSummary.
Default: Not enabled
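A sketch of enabling both loggers (assuming an existing estimator named `estimator`; the log directory and app name `/tmp/bigdl-logs` / `myApp` are illustrative, and the import path may vary across BigDL versions):

```scala
import com.intel.analytics.bigdl.visualization.{TrainSummary, ValidationSummary}

// Training statistics go to /tmp/bigdl-logs/myApp/train; validation
// statistics are written under the same logDir/appName location.
estimator.setTrainSummary(TrainSummary("/tmp/bigdl-logs", "myApp"))
estimator.setValidationSummary(ValidationSummary("/tmp/bigdl-logs", "myApp"))
```

The resulting logs can then be visualized with `tensorboard --logdir /tmp/bigdl-logs/myApp`.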
Set a validation evaluation during training
how often to evaluate the validation set
validation data set
a set of validation method ValidationMethod
batch size for validation
this optimizer
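Putting the parameters above together, a hedged usage sketch (assuming an existing estimator named `estimator` and a validation DataFrame named `valDF`, both hypothetical; the batch size is illustrative):

```scala
import com.intel.analytics.bigdl.optim.{Trigger, Top1Accuracy}

// Evaluate Top-1 accuracy on the validation set at the end of every
// epoch, processing the validation data in batches of 128:
estimator.setValidation(
  Trigger.everyEpoch,
  valDF,
  Array(new Top1Accuracy[Float]),
  128)
```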
Enable validation summary
Subclasses can extend this method and return the required model for different transform tasks.
NNClassifier is a specialized NNEstimator that simplifies the data format for classification tasks. It explicitly supports a label column of DoubleType, and the fitted NNClassifierModel will have a prediction column of DoubleType.
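A hedged end-to-end sketch (the model shape, column contents, and the DataFrame `trainDF` are assumptions; NNClassifier's import path and constructor arguments differ between BigDL/Analytics Zoo releases):

```scala
import com.intel.analytics.bigdl.nn.{Sequential, Linear, LogSoftMax, ClassNLLCriterion}

// A two-class classifier over 10 input features.
val model = Sequential[Float]()
  .add(Linear[Float](10, 2))
  .add(LogSoftMax[Float]())

// featureSize = Array(10) tells the classifier the shape of each feature row.
val classifier = NNClassifier(model, ClassNLLCriterion[Float](), Array(10))
  .setBatchSize(64)
  .setMaxEpoch(20)

// fit returns an NNClassifierModel; transform then adds a
// DoubleType "prediction" column to the input DataFrame.
val nnModel = classifier.fit(trainDF)
```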