Global batch size across the cluster.
Perform a prediction on featureCol and write the result to predictionCol.
The trained BigDL model to use for prediction.
Set the global batch size across the cluster. Global batch size = batch size per thread * total number of cores.
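As a concrete illustration of the formula above, a minimal sketch in plain Scala (the function name and the example worker counts are hypothetical, not part of the library API):

```scala
object BatchSizeExample {
  // Global batch size = batch size per thread * total number of cores,
  // where total cores = cores per executor * number of executors.
  def globalBatchSize(batchPerThread: Int, coresPerExecutor: Int, numExecutors: Int): Int =
    batchPerThread * coresPerExecutor * numExecutors

  def main(args: Array[String]): Unit = {
    // e.g. 4 samples per thread, 2 executors with 8 cores each
    val global = globalBatchSize(batchPerThread = 4, coresPerExecutor = 8, numExecutors = 2)
    println(global) // prints 64
  }
}
```

This is why the batch size passed to the estimator must be divisible by the total core count of the cluster.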
Set the Preprocessing used to convert feature data into the model's input format.
NNModel extends Spark ML Transformer and supports BigDL models with Spark DataFrame data.
NNModel supports different feature data types through Preprocessing. Pre-defined Preprocessing for popular data types such as Array or Vector is provided in the package com.intel.analytics.zoo.feature, and users can also develop customized Preprocessing. During transform, NNModel extracts the feature data from the input DataFrame and uses the Preprocessing to prepare the data for the model.
After transform, the prediction column contains the output of the model as Array[T], where T (Double or Float) is determined by the model type.
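To illustrate the transform contract described above without a Spark dependency, here is a minimal pure-Scala sketch: a stand-in `transform` extracts the feature array from each row, applies a preprocessing step and a model function, and writes the output into the prediction column as Array[Float]. All names here are hypothetical and purely illustrative; the real NNModel operates on Spark DataFrames.

```scala
object NNModelSketch {
  // A stand-in for a DataFrame row: feature data plus an optional prediction column.
  case class Row(features: Array[Float], prediction: Option[Array[Float]] = None)

  // Stand-in for NNModel.transform: preprocess each row's features,
  // run the model, and write the output into the prediction column.
  def transform(rows: Seq[Row],
                preprocess: Array[Float] => Array[Float],
                model: Array[Float] => Array[Float]): Seq[Row] =
    rows.map(r => r.copy(prediction = Some(model(preprocess(r.features)))))

  def main(args: Array[String]): Unit = {
    // Toy "model": sum the features into a single output value.
    val model: Array[Float] => Array[Float] = f => Array(f.sum)
    val out = transform(Seq(Row(Array(1f, 2f, 3f))), identity, model)
    println(out.head.prediction.get.toList) // prints List(6.0)
  }
}
```

The sketch mirrors the documented behavior: the input column is left untouched, and each row gains a prediction whose element type matches the model's numeric type.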