Class OutputConfig
- java.lang.Object
-
- software.amazon.awssdk.services.sagemaker.model.OutputConfig
-
- All Implemented Interfaces:
Serializable, SdkPojo, ToCopyableBuilder<OutputConfig.Builder,OutputConfig>
@Generated("software.amazon.awssdk:codegen") public final class OutputConfig extends Object implements SdkPojo, Serializable, ToCopyableBuilder<OutputConfig.Builder,OutputConfig>
Contains information about the output location for the compiled model and the target device that the model runs on.
TargetDevice and TargetPlatform are mutually exclusive, so you must choose one of the two to specify your target device or platform. If the device you want to use is not on the TargetDevice list, use TargetPlatform to describe the platform of your edge device, and CompilerOptions if there are specific settings that are required or recommended for that particular TargetPlatform.
- See Also:
- Serialized Form
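A minimal sketch of constructing this model with its builder, assuming the standard SDK v2 fluent setters and the TargetDevice.JETSON_TX2 constant; the bucket name is a placeholder. Set either targetDevice or targetPlatform, never both; a targetPlatform-based variant is sketched under targetPlatform() below.

    import software.amazon.awssdk.services.sagemaker.model.OutputConfig;
    import software.amazon.awssdk.services.sagemaker.model.TargetDevice;

    public class OutputConfigSketch {
        // Option 1: the device is on the TargetDevice list, so name it directly.
        static OutputConfig forKnownDevice() {
            return OutputConfig.builder()
                    .s3OutputLocation("s3://amzn-s3-demo-bucket/compiled-models/") // placeholder bucket
                    .targetDevice(TargetDevice.JETSON_TX2)                         // assumed enum constant
                    .build();
        }
    }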
-
-
Nested Class Summary
Nested Classes:
- static interface OutputConfig.Builder
-
Method Summary
- static OutputConfig.Builder builder()
- String compilerOptions(): Specifies additional parameters for compiler options in JSON format.
- boolean equals(Object obj)
- boolean equalsBySdkFields(Object obj)
- <T> Optional<T> getValueForField(String fieldName, Class<T> clazz)
- int hashCode()
- String kmsKeyId(): The Amazon Web Services Key Management Service key (Amazon Web Services KMS) that Amazon SageMaker AI uses to encrypt your output models with Amazon S3 server-side encryption after the compilation job.
- String s3OutputLocation(): Identifies the S3 bucket where you want Amazon SageMaker AI to store the model artifacts.
- Map<String,SdkField<?>> sdkFieldNameToField()
- List<SdkField<?>> sdkFields()
- static Class<? extends OutputConfig.Builder> serializableBuilderClass()
- TargetDevice targetDevice(): Identifies the target device or the machine learning instance that you want to run your model on after the compilation has completed.
- String targetDeviceAsString(): Identifies the target device or the machine learning instance that you want to run your model on after the compilation has completed.
- TargetPlatform targetPlatform(): Contains information about a target platform that you want your model to run on, such as OS, architecture, and accelerators.
- OutputConfig.Builder toBuilder()
- String toString(): Returns a string representation of this object.
Methods inherited from class java.lang.Object
clone, finalize, getClass, notify, notifyAll, wait, wait, wait
-
Methods inherited from interface software.amazon.awssdk.utils.builder.ToCopyableBuilder
copy
-
-
-
-
Method Detail
-
s3OutputLocation
public final String s3OutputLocation()
Identifies the S3 bucket where you want Amazon SageMaker AI to store the model artifacts. For example,
s3://bucket-name/key-name-prefix.
- Returns:
- Identifies the S3 bucket where you want Amazon SageMaker AI to store the model artifacts. For example, s3://bucket-name/key-name-prefix.
-
targetDevice
public final TargetDevice targetDevice()
Identifies the target device or the machine learning instance that you want to run your model on after the compilation has completed. Alternatively, you can specify OS, architecture, and accelerator using TargetPlatform fields. It can be used instead of TargetPlatform.
Currently ml_trn1 is available only in US East (N. Virginia) Region, and ml_inf2 is available only in US East (Ohio) Region.
If the service returns an enum value that is not available in the current SDK version, targetDevice will return TargetDevice.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from targetDeviceAsString().
- Returns:
- Identifies the target device or the machine learning instance that you want to run your model on after the compilation has completed. Alternatively, you can specify OS, architecture, and accelerator using TargetPlatform fields. It can be used instead of TargetPlatform. Currently ml_trn1 is available only in US East (N. Virginia) Region, and ml_inf2 is available only in US East (Ohio) Region.
- See Also:
TargetDevice
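A short sketch of the enum-versus-string behavior described above (reusing the imports from the class-level sketch; logDevice is a hypothetical helper name):

    // Prefer the raw string when the SDK does not model the returned device value.
    static void logDevice(OutputConfig config) {
        if (config.targetDevice() == TargetDevice.UNKNOWN_TO_SDK_VERSION) {
            // The service returned a newer device value than this SDK version knows about.
            System.out.println("Unrecognized target device: " + config.targetDeviceAsString());
        } else {
            System.out.println("Target device: " + config.targetDevice());
        }
    }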
-
targetDeviceAsString
public final String targetDeviceAsString()
Identifies the target device or the machine learning instance that you want to run your model on after the compilation has completed. Alternatively, you can specify OS, architecture, and accelerator using TargetPlatform fields. It can be used instead of TargetPlatform.
Currently ml_trn1 is available only in US East (N. Virginia) Region, and ml_inf2 is available only in US East (Ohio) Region.
If the service returns an enum value that is not available in the current SDK version, targetDevice will return TargetDevice.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from targetDeviceAsString().
- Returns:
- Identifies the target device or the machine learning instance that you want to run your model on after the compilation has completed. Alternatively, you can specify OS, architecture, and accelerator using TargetPlatform fields. It can be used instead of TargetPlatform. Currently ml_trn1 is available only in US East (N. Virginia) Region, and ml_inf2 is available only in US East (Ohio) Region.
- See Also:
TargetDevice
-
targetPlatform
public final TargetPlatform targetPlatform()
Contains information about a target platform that you want your model to run on, such as OS, architecture, and accelerators. It is an alternative to TargetDevice.
The following examples show how to configure the TargetPlatform and CompilerOptions JSON strings for popular target platforms:
- Raspberry Pi 3 Model B+: "TargetPlatform": {"Os": "LINUX", "Arch": "ARM_EABIHF"}, "CompilerOptions": {'mattr': ['+neon']}
- Jetson TX2: "TargetPlatform": {"Os": "LINUX", "Arch": "ARM64", "Accelerator": "NVIDIA"}, "CompilerOptions": {'gpu-code': 'sm_62', 'trt-ver': '6.0.1', 'cuda-ver': '10.0'}
- EC2 m5.2xlarge instance OS: "TargetPlatform": {"Os": "LINUX", "Arch": "X86_64", "Accelerator": "NVIDIA"}, "CompilerOptions": {'mcpu': 'skylake-avx512'}
- RK3399: "TargetPlatform": {"Os": "LINUX", "Arch": "ARM64", "Accelerator": "MALI"}
- ARMv7 phone (CPU): "TargetPlatform": {"Os": "ANDROID", "Arch": "ARM_EABI"}, "CompilerOptions": {'ANDROID_PLATFORM': 25, 'mattr': ['+neon']}
- ARMv8 phone (CPU): "TargetPlatform": {"Os": "ANDROID", "Arch": "ARM64"}, "CompilerOptions": {'ANDROID_PLATFORM': 29}
- Returns:
- Contains information about a target platform that you want your model to run on, such as OS, architecture, and accelerators. It is an alternative to TargetDevice. The following examples show how to configure the TargetPlatform and CompilerOptions JSON strings for popular target platforms:
  - Raspberry Pi 3 Model B+: "TargetPlatform": {"Os": "LINUX", "Arch": "ARM_EABIHF"}, "CompilerOptions": {'mattr': ['+neon']}
  - Jetson TX2: "TargetPlatform": {"Os": "LINUX", "Arch": "ARM64", "Accelerator": "NVIDIA"}, "CompilerOptions": {'gpu-code': 'sm_62', 'trt-ver': '6.0.1', 'cuda-ver': '10.0'}
  - EC2 m5.2xlarge instance OS: "TargetPlatform": {"Os": "LINUX", "Arch": "X86_64", "Accelerator": "NVIDIA"}, "CompilerOptions": {'mcpu': 'skylake-avx512'}
  - RK3399: "TargetPlatform": {"Os": "LINUX", "Arch": "ARM64", "Accelerator": "MALI"}
  - ARMv7 phone (CPU): "TargetPlatform": {"Os": "ANDROID", "Arch": "ARM_EABI"}, "CompilerOptions": {'ANDROID_PLATFORM': 25, 'mattr': ['+neon']}
  - ARMv8 phone (CPU): "TargetPlatform": {"Os": "ANDROID", "Arch": "ARM64"}, "CompilerOptions": {'ANDROID_PLATFORM': 29}
-
-
-
compilerOptions
public final String compilerOptions()
Specifies additional parameters for compiler options in JSON format. The compiler options are TargetPlatform specific. It is required for NVIDIA accelerators and highly recommended for CPU compilations. For any other cases, it is optional to specify CompilerOptions.
- DTYPE: Specifies the data type for the input. When compiling for ml_* (except for ml_inf) instances using the PyTorch framework, provide the data type (dtype) of the model's input. "float32" is used if "DTYPE" is not specified. Options for the data type are:
  - float32: Use either "float" or "float32".
  - int64: Use either "int64" or "long".
  For example, {"dtype" : "float32"}.
- CPU: Compilation for CPU supports the following compiler options.
  - mcpu: CPU micro-architecture. For example, {'mcpu': 'skylake-avx512'}
  - mattr: CPU flags. For example, {'mattr': ['+neon', '+vfpv4']}
- ARM: Details of ARM CPU compilations.
  - NEON: NEON is an implementation of the Advanced SIMD extension used in ARMv7 processors. For example, add {'mattr': ['+neon']} to the compiler options if compiling for an ARM 32-bit platform with NEON support.
- NVIDIA: Compilation for NVIDIA GPU supports the following compiler options.
  - gpu-code: Specifies the targeted architecture.
  - trt-ver: Specifies the TensorRT version in x.y.z format.
  - cuda-ver: Specifies the CUDA version in x.y format.
  For example, {'gpu-code': 'sm_72', 'trt-ver': '6.0.1', 'cuda-ver': '10.1'}
- ANDROID: Compilation for the Android OS supports the following compiler options:
  - ANDROID_PLATFORM: Specifies the Android API level. Available levels range from 21 to 29. For example, {'ANDROID_PLATFORM': 28}.
  - mattr: Add {'mattr': ['+neon']} to the compiler options if compiling for an ARM 32-bit platform with NEON support.
- INFERENTIA: Compilation for target ml_inf1 uses compiler options passed in as a JSON string. For example, "CompilerOptions": "\"--verbose 1 --num-neuroncores 2 -O2\"". For information about supported compiler options, see the Neuron Compiler CLI Reference Guide.
- CoreML: Compilation for the CoreML OutputConfig TargetDevice supports the following compiler options:
  - class_labels: Specifies the classification labels file name inside the input tar.gz file. For example, {"class_labels": "imagenet_labels_1000.txt"}. Labels inside the txt file should be separated by newlines.
-
- Returns:
- Specifies additional parameters for compiler options in JSON format. The compiler options are TargetPlatform specific. It is required for NVIDIA accelerators and highly recommended for CPU compilations. For any other cases, it is optional to specify CompilerOptions.
  - DTYPE: Specifies the data type for the input. When compiling for ml_* (except for ml_inf) instances using the PyTorch framework, provide the data type (dtype) of the model's input. "float32" is used if "DTYPE" is not specified. Options for the data type are:
    - float32: Use either "float" or "float32".
    - int64: Use either "int64" or "long".
    For example, {"dtype" : "float32"}.
  - CPU: Compilation for CPU supports the following compiler options.
    - mcpu: CPU micro-architecture. For example, {'mcpu': 'skylake-avx512'}
    - mattr: CPU flags. For example, {'mattr': ['+neon', '+vfpv4']}
  - ARM: Details of ARM CPU compilations.
    - NEON: NEON is an implementation of the Advanced SIMD extension used in ARMv7 processors. For example, add {'mattr': ['+neon']} to the compiler options if compiling for an ARM 32-bit platform with NEON support.
  - NVIDIA: Compilation for NVIDIA GPU supports the following compiler options.
    - gpu-code: Specifies the targeted architecture.
    - trt-ver: Specifies the TensorRT version in x.y.z format.
    - cuda-ver: Specifies the CUDA version in x.y format.
    For example, {'gpu-code': 'sm_72', 'trt-ver': '6.0.1', 'cuda-ver': '10.1'}
  - ANDROID: Compilation for the Android OS supports the following compiler options:
    - ANDROID_PLATFORM: Specifies the Android API level. Available levels range from 21 to 29. For example, {'ANDROID_PLATFORM': 28}.
    - mattr: Add {'mattr': ['+neon']} to the compiler options if compiling for an ARM 32-bit platform with NEON support.
  - INFERENTIA: Compilation for target ml_inf1 uses compiler options passed in as a JSON string. For example, "CompilerOptions": "\"--verbose 1 --num-neuroncores 2 -O2\"". For information about supported compiler options, see the Neuron Compiler CLI Reference Guide.
  - CoreML: Compilation for the CoreML OutputConfig TargetDevice supports the following compiler options:
    - class_labels: Specifies the classification labels file name inside the input tar.gz file. For example, {"class_labels": "imagenet_labels_1000.txt"}. Labels inside the txt file should be separated by newlines.
-
-
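A brief sketch showing that CompilerOptions is passed to the builder as a raw JSON string (here the ARMv8 Android example from targetPlatform(), API level 29; enum constants are assumed as before and the bucket is a placeholder):

    // CompilerOptions is a JSON document serialized into a plain String.
    static OutputConfig forAndroidArm64() {
        return OutputConfig.builder()
                .s3OutputLocation("s3://amzn-s3-demo-bucket/compiled-models/")  // placeholder bucket
                .targetPlatform(TargetPlatform.builder()
                        .os(TargetPlatformOs.ANDROID)
                        .arch(TargetPlatformArch.ARM64)
                        .build())
                .compilerOptions("{\"ANDROID_PLATFORM\": 29}")
                .build();
    }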
-
-
kmsKeyId
public final String kmsKeyId()
The Amazon Web Services Key Management Service key (Amazon Web Services KMS) that Amazon SageMaker AI uses to encrypt your output models with Amazon S3 server-side encryption after the compilation job. If you don't provide a KMS key ID, Amazon SageMaker AI uses the default KMS key for Amazon S3 for your role's account. For more information, see KMS-Managed Encryption Keys in the Amazon Simple Storage Service Developer Guide.
The KmsKeyId can be any of the following formats:
- Key ID: 1234abcd-12ab-34cd-56ef-1234567890ab
- Key ARN: arn:aws:kms:us-west-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab
- Alias name: alias/ExampleAlias
- Alias name ARN: arn:aws:kms:us-west-2:111122223333:alias/ExampleAlias
- Returns:
- The Amazon Web Services Key Management Service key (Amazon Web Services KMS) that Amazon SageMaker AI uses to encrypt your output models with Amazon S3 server-side encryption after the compilation job. If you don't provide a KMS key ID, Amazon SageMaker AI uses the default KMS key for Amazon S3 for your role's account. For more information, see KMS-Managed Encryption Keys in the Amazon Simple Storage Service Developer Guide. The KmsKeyId can be any of the following formats:
  - Key ID: 1234abcd-12ab-34cd-56ef-1234567890ab
  - Key ARN: arn:aws:kms:us-west-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab
  - Alias name: alias/ExampleAlias
  - Alias name ARN: arn:aws:kms:us-west-2:111122223333:alias/ExampleAlias
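A minimal sketch of supplying a key in one of the formats listed above (the key ARN and account number are the documentation placeholders; TargetDevice.ML_C5 is an assumed constant, extending the class-level sketch):

    // Encrypt the compiled artifacts in S3 with a customer-managed KMS key.
    static OutputConfig withKmsKey() {
        return OutputConfig.builder()
                .s3OutputLocation("s3://amzn-s3-demo-bucket/compiled-models/")  // placeholder bucket
                .targetDevice(TargetDevice.ML_C5)                               // assumed enum constant
                .kmsKeyId("arn:aws:kms:us-west-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab")
                .build();
    }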
-
-
-
toBuilder
public OutputConfig.Builder toBuilder()
- Specified by:
toBuilder in interface ToCopyableBuilder<OutputConfig.Builder,OutputConfig>
-
builder
public static OutputConfig.Builder builder()
-
serializableBuilderClass
public static Class<? extends OutputConfig.Builder> serializableBuilderClass()
-
equalsBySdkFields
public final boolean equalsBySdkFields(Object obj)
- Specified by:
equalsBySdkFields in interface SdkPojo
-
toString
public final String toString()
Returns a string representation of this object. This is useful for testing and debugging. Sensitive data will be redacted from this string using a placeholder value.
-
sdkFieldNameToField
public final Map<String,SdkField<?>> sdkFieldNameToField()
- Specified by:
sdkFieldNameToField in interface SdkPojo
-
-