Interface Model

All Superinterfaces:
AutoCloseable

A deep learning model usually contains the following parts:
- the Block of operations to run
- the Parameters that are trained
- Input/Output information: the input and output parameter names, shapes, etc.
- Other artifacts, such as a synset for classification, that are used during pre-processing and post-processing

For loading a pre-trained model, see load(Path, String).

For training a model, see Trainer.

For running inference with a model, see Predictor.
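The parts above come together in a short lifecycle sketch. This is a hedged example rather than canonical usage: it assumes a DJL engine (e.g. PyTorch) is on the classpath and that the hypothetical directory build/model contains saved parameters for a model named "my-model".

```java
import java.nio.file.Paths;

import ai.djl.Model;
import ai.djl.inference.Predictor;
import ai.djl.ndarray.NDList;
import ai.djl.translate.NoopTranslator;

public class ModelLifecycle {
    public static void main(String[] args) throws Exception {
        // Model implements AutoCloseable, so try-with-resources releases native memory
        try (Model model = Model.newInstance("my-model")) {
            // Load pre-trained parameters from a (hypothetical) directory
            model.load(Paths.get("build/model"));
            // NoopTranslator does no pre-/post-processing: NDList in, NDList out
            try (Predictor<NDList, NDList> predictor = model.newPredictor(new NoopTranslator())) {
                // predictor.predict(input) would run inference here
            }
        }
    }
}
```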
Method Summary

| Modifier and Type | Method | Description |
| --- | --- | --- |
| default void | cast(DataType dataType) | Casts the model to support a different precision level. |
| void | close() | |
| PairList<String, Shape> | describeInput() | Returns the input descriptor of the model. |
| PairList<String, Shape> | describeOutput() | Returns the output descriptor of the model. |
| URL | getArtifact(String name) | Finds an artifact resource with a given name in the model. |
| <T> T | getArtifact(String name, Function<InputStream, T> function) | Attempts to load the artifact using the given function and cache it if the specified artifact is not already cached. |
| InputStream | getArtifactAsStream(String name) | Finds an artifact resource with a given name in the model. |
| String[] | getArtifactNames() | Returns the artifact names associated with the model. |
| Block | getBlock() | Gets the block from the Model. |
| DataType | getDataType() | Returns the standard data type used within the model. |
| Path | getModelPath() | Returns the directory from where the model is loaded. |
| String | getName() | Gets the model name. |
| NDManager | getNDManager() | Gets the NDManager from the model. |
| Map<String, String> | getProperties() | Returns the model's properties. |
| String | getProperty(String key) | Returns the property of the model based on the property name. |
| default String | getProperty(String key, String defValue) | Returns the property of the model based on the property name. |
| default int | intProperty(String key, int defValue) | Returns the property of the model based on the property name. |
| default void | load(InputStream is) | Loads the model from the InputStream. |
| void | load(InputStream is, Map<String, ?> options) | Loads the model from the InputStream with the options provided. |
| default void | load(Path modelPath) | Loads the model from the modelPath. |
| default void | load(Path modelPath, String prefix) | Loads the model from the modelPath and the given name. |
| void | load(Path modelPath, String prefix, Map<String, ?> options) | Loads the model from the modelPath with the name and options provided. |
| default long | longProperty(String key, long defValue) | Returns the property of the model based on the property name. |
| static Model | newInstance(String name) | Creates an empty model instance. |
| static Model | newInstance(String name, Device device) | Creates an empty model instance on the specified Device. |
| static Model | newInstance(String name, Device device, String engineName) | Creates an empty model instance on the specified Device and engine. |
| static Model | newInstance(String name, String engineName) | Creates an empty model instance on the specified engine. |
| default <I, O> Predictor<I, O> | newPredictor(Translator<I, O> translator) | Creates a new Predictor based on the model on the current device. |
| <I, O> Predictor<I, O> | newPredictor(Translator<I, O> translator, Device device) | Creates a new Predictor based on the model. |
| Trainer | newTrainer(TrainingConfig trainingConfig) | Creates a new Trainer instance for a Model. |
| default void | quantize() | Converts the model to use a lower precision quantized network. |
| void | save(Path modelPath, String newModelName) | Saves the model to the specified modelPath with the name provided. |
| void | setBlock(Block block) | Sets the block for the Model for training and inference. |
| void | setDataType(DataType dataType) | Sets the standard data type used within the model. |
| void | setProperty(String key, String value) | Sets a property on the model. |
- 
Method Details
- 
newInstance
Creates an empty model instance.
- Parameters:
  name - the model name
- Returns:
  a new Model instance
 
 - 
newInstance
Creates an empty model instance on the specified Device.
- Parameters:
  name - the model name
  device - the device to load the model onto
- Returns:
  a new model instance
 
 - 
newInstance
Creates an empty model instance on the specified engine.
- Parameters:
  name - the model name
  engineName - the name of the engine
- Returns:
  a new model instance
 
 - 
newInstance
Creates an empty model instance on the specified Device and engine.
- Parameters:
  name - the model name
  device - the device to load the model onto
  engineName - the name of the engine
- Returns:
  a new model instance
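The four overloads can be sketched as follows. This assumes at least one engine is on the classpath; the engine name "PyTorch" and the model names are illustrative.

```java
import ai.djl.Device;
import ai.djl.Model;

public class NewInstanceVariants {
    public static void main(String[] args) {
        // default engine and default device
        try (Model a = Model.newInstance("a")) {
            System.out.println(a.getName());
        }
        // pin the device explicitly
        try (Model b = Model.newInstance("b", Device.cpu())) {
        }
        // select an engine by name (that engine must be on the classpath)
        try (Model c = Model.newInstance("c", "PyTorch")) {
        }
        // select both device and engine
        try (Model d = Model.newInstance("d", Device.cpu(), "PyTorch")) {
        }
    }
}
```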
 
 - 
load
Loads the model from the modelPath.
- Parameters:
  modelPath - the directory or file path of the model location
- Throws:
  IOException - when an IO operation fails in loading a resource
  MalformedModelException - if the model file is corrupted
 - 
load
Loads the model from the modelPath and the given name.
- Parameters:
  modelPath - the directory or file path of the model location
  prefix - the model file name or path prefix
- Throws:
  IOException - when an IO operation fails in loading a resource
  MalformedModelException - if the model file is corrupted
 - 
load
void load(Path modelPath, String prefix, Map<String, ?> options) throws IOException, MalformedModelException
Loads the model from the modelPath with the name and options provided.
- Parameters:
  modelPath - the directory or file path of the model location
  prefix - the model file name or path prefix
  options - engine-specific load model options; see the documentation for each engine
- Throws:
  IOException - when an IO operation fails in loading a resource
  MalformedModelException - if the model file is corrupted
 - 
load
Loads the model from the InputStream.
- Parameters:
  is - the InputStream to load the model from
- Throws:
  IOException - when an IO operation fails in loading a resource
  MalformedModelException - if the model file is corrupted
 - 
load
Loads the model from the InputStream with the options provided.
- Parameters:
  is - the InputStream to load the model from
  options - engine-specific load model options; see the documentation for each engine
- Throws:
  IOException - when an IO operation fails in loading a resource
  MalformedModelException - if the model file is corrupted
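A hedged sketch of the Path-based overloads. The directory build/model and the option key are illustrative; the directory must actually contain saved parameters for the calls to succeed.

```java
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Map;

import ai.djl.Model;

public class LoadVariants {
    public static void main(String[] args) throws Exception {
        Path dir = Paths.get("build/model"); // hypothetical model directory
        try (Model model = Model.newInstance("my-model")) {
            // simplest form: parameters are looked up under the model's own name
            model.load(dir);
        }
        try (Model model = Model.newInstance("my-model")) {
            // explicit file-name prefix plus engine-specific options
            // ("illustrativeOption" is a made-up key; real keys vary per engine)
            model.load(dir, "my-model", Map.of("illustrativeOption", "value"));
        }
    }
}
```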
 - 
save
Saves the model to the specified modelPath with the name provided.
- Parameters:
  modelPath - the directory or file path of the model location
  newModelName - the new model name to be saved; use null to keep the original model name
- Throws:
  IOException - when an IO operation fails while saving the model
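A minimal save sketch, assuming an engine is available. The identity block stands in for a real trained network, and the output path and name are illustrative.

```java
import java.nio.file.Path;
import java.nio.file.Paths;

import ai.djl.Model;
import ai.djl.nn.Blocks;

public class SaveDemo {
    public static void main(String[] args) throws Exception {
        try (Model model = Model.newInstance("my-model")) {
            model.setBlock(Blocks.identityBlock()); // placeholder block for the sketch
            Path dir = Paths.get("build/model");
            // pass null as newModelName to keep the original model name
            model.save(dir, "my-model-v2");
        }
    }
}
```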
 - 
getModelPath
Path getModelPath()
Returns the directory from where the model is loaded.
- Returns:
  the directory of the model location
 
 - 
getBlock
Block getBlock()
Gets the block from the Model.
- Returns:
  the Block
 - 
setBlock
Sets the block for the Model for training and inference.
- Parameters:
  block - the Block used in the Model
 - 
getName
String getName()
Gets the model name.
- Returns:
  the name of the model
 
 - 
getProperties
Returns the model's properties.
- Returns:
  the model's properties
 
 - 
getProperty
Returns the property of the model based on the property name.
- Parameters:
  key - the name of the property
- Returns:
  the value of the property
 
 - 
getProperty
Returns the property of the model based on the property name.
- Parameters:
  key - the name of the property
  defValue - the default value if the key is not found
- Returns:
  the value of the property
 
 - 
setProperty
Sets a property on the model.
Properties are saved and loaded with the model, so the user can store information about the model here.
- Parameters:
  key - the name of the property
  value - the value of the property
 - 
intProperty
Returns the property of the model based on the property name.
- Parameters:
  key - the name of the property
  defValue - the default value if the key is not found
- Returns:
  the value of the property
 
 - 
longProperty
Returns the property of the model based on the property name.
- Parameters:
  key - the name of the property
  defValue - the default value if the key is not found
- Returns:
  the value of the property
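The property accessors behave like a typed view over a String-valued map; a small sketch (the keys are arbitrary):

```java
import ai.djl.Model;

public class PropertyDemo {
    public static void main(String[] args) {
        try (Model model = Model.newInstance("demo")) {
            model.setProperty("Epoch", "3");

            String raw = model.getProperty("Epoch");              // stored as a String
            int epochs = model.intProperty("Epoch", 0);           // parsed as an int
            long steps = model.longProperty("Steps", -1L);        // key absent, default used
            String owner = model.getProperty("Owner", "unknown"); // key absent, default used

            System.out.println(raw + " " + epochs + " " + steps + " " + owner);
        }
    }
}
```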
 
 - 
getNDManager
NDManager getNDManager()
Gets the NDManager from the model.
- Returns:
  the NDManager
 - 
newTrainer
Creates a new Trainer instance for a Model.
- Parameters:
  trainingConfig - the training configuration settings
- Returns:
  the Trainer instance
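A hedged trainer sketch; the identity block and L2 loss are placeholders for a real network and task:

```java
import ai.djl.Model;
import ai.djl.ndarray.types.Shape;
import ai.djl.nn.Blocks;
import ai.djl.training.DefaultTrainingConfig;
import ai.djl.training.Trainer;
import ai.djl.training.loss.Loss;

public class TrainerDemo {
    public static void main(String[] args) {
        try (Model model = Model.newInstance("mlp")) {
            model.setBlock(Blocks.identityBlock()); // stand-in for a real network
            DefaultTrainingConfig config = new DefaultTrainingConfig(Loss.l2Loss());
            try (Trainer trainer = model.newTrainer(config)) {
                // allocate and initialize parameters for a batch of shape (1, 4)
                trainer.initialize(new Shape(1, 4));
                // a real loop would iterate a Dataset and call trainer.step()
            }
        }
    }
}
```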
 - 
newPredictor
Creates a new Predictor based on the model on the current device.
- Type Parameters:
  I - the input object for pre-processing
  O - the output object from post-processing
- Parameters:
  translator - the object used for pre-processing and post-processing
- Returns:
  an instance of Predictor
 - 
newPredictor
Creates a new Predictor based on the model.
- Type Parameters:
  I - the input object for pre-processing
  O - the output object from post-processing
- Parameters:
  translator - the object used for pre-processing and post-processing
  device - the device to use for prediction
- Returns:
  an instance of Predictor
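An end-to-end predictor sketch using an identity block, so the output equals the input; NoopTranslator passes NDLists through unchanged:

```java
import ai.djl.Device;
import ai.djl.Model;
import ai.djl.inference.Predictor;
import ai.djl.ndarray.NDList;
import ai.djl.ndarray.NDManager;
import ai.djl.nn.Blocks;
import ai.djl.translate.NoopTranslator;

public class PredictorDemo {
    public static void main(String[] args) throws Exception {
        try (Model model = Model.newInstance("identity");
                NDManager manager = NDManager.newBaseManager()) {
            model.setBlock(Blocks.identityBlock()); // forward(x) returns x
            try (Predictor<NDList, NDList> predictor =
                    model.newPredictor(new NoopTranslator(), Device.cpu())) {
                NDList input = new NDList(manager.create(new float[] {1f, 2f, 3f}));
                NDList output = predictor.predict(input);
                System.out.println(output.singletonOrThrow());
            }
        }
    }
}
```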
 - 
describeInput
Returns the input descriptor of the model.
It contains the information that can be extracted from the model, usually the name, shape, layout, and DataType.
- Returns:
  a PairList of String and Shape
 
 - 
describeOutput
Returns the output descriptor of the model.
It contains the output information that can be obtained from the model.
- Returns:
  a PairList of String and Shape
 
 - 
getArtifactNames
String[] getArtifactNames()
Returns the artifact names associated with the model.
- Returns:
  an array of artifact names
 
 - 
getArtifact
Attempts to load the artifact using the given function and cache it if the specified artifact is not already cached.
The Model caches loaded artifacts, so the user doesn't need to keep track of them.

String synset = model.getArtifact("synset.txt", k -> IOUtils.toString(k));

- Type Parameters:
  T - the type of the returned artifact object
- Parameters:
  name - the name of the desired artifact
  function - the function to load the artifact
- Returns:
  the current (existing or computed) artifact associated with the specified name, or null if the computed value is null
- Throws:
  IOException - when an IO operation fails in loading a resource
  ClassCastException - if the cached artifact cannot be cast to the target class
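The caching accessor can be sketched as below. This assumes the model's directory contains a synset.txt artifact (hypothetical here), and the lambda deliberately avoids checked exceptions, since the parameter is a plain java.util.function.Function.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.stream.Collectors;

import ai.djl.Model;

public class ArtifactDemo {
    public static void main(String[] args) throws Exception {
        try (Model model = Model.newInstance("classifier")) {
            // hypothetical: assumes synset.txt exists in the model directory
            List<String> synset = model.getArtifact(
                    "synset.txt",
                    is -> new BufferedReader(new InputStreamReader(is, StandardCharsets.UTF_8))
                            .lines()
                            .collect(Collectors.toList()));
            // a second getArtifact("synset.txt", ...) call returns the cached list
            System.out.println(synset.size() + " classes");
        }
    }
}
```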
 - 
getArtifact
Finds an artifact resource with a given name in the model.
- Parameters:
  name - the name of the desired artifact
- Returns:
  a URL object, or null if no artifact with this name is found
- Throws:
  IOException - when an IO operation fails in loading a resource
 - 
getArtifactAsStream
Finds an artifact resource with a given name in the model.
- Parameters:
  name - the name of the desired artifact
- Returns:
  an InputStream object, or null if no resource with this name is found
- Throws:
  IOException - when an IO operation fails in loading a resource
 - 
setDataType
Sets the standard data type used within the model.
- Parameters:
  dataType - the standard data type to use
 - 
getDataType
DataType getDataType()
Returns the standard data type used within the model.
- Returns:
  the standard data type used within the model
 
 - 
cast
Casts the model to support a different precision level.
For example, you can cast the precision from Float to Int.
- Parameters:
  dataType - the target DataType you would like to cast to
 - 
quantize
default void quantize()
Converts the model to use a lower-precision quantized network.
Quantization converts the network to use the int8 data type where possible, for a smaller model size and faster computation without too large a drop in accuracy. See the original paper.
 - 
close
void close()- Specified by:
 closein interfaceAutoCloseable
 