Interface ContainerDefinition.Builder
- All Superinterfaces: Buildable, CopyableBuilder<ContainerDefinition.Builder,ContainerDefinition>, SdkBuilder<ContainerDefinition.Builder,ContainerDefinition>, SdkPojo
- Enclosing class: ContainerDefinition
public static interface ContainerDefinition.Builder extends SdkPojo, CopyableBuilder<ContainerDefinition.Builder,ContainerDefinition>
Method Summary

- ContainerDefinition.Builder additionalModelDataSources(Collection<AdditionalModelDataSource> additionalModelDataSources)
  Data sources that are available to your model in addition to the one that you specify for ModelDataSource when you use the CreateModel action.
- ContainerDefinition.Builder additionalModelDataSources(Consumer<AdditionalModelDataSource.Builder>... additionalModelDataSources)
  Data sources that are available to your model in addition to the one that you specify for ModelDataSource when you use the CreateModel action.
- ContainerDefinition.Builder additionalModelDataSources(AdditionalModelDataSource... additionalModelDataSources)
  Data sources that are available to your model in addition to the one that you specify for ModelDataSource when you use the CreateModel action.
- ContainerDefinition.Builder containerHostname(String containerHostname)
  This parameter is ignored for models that contain only a PrimaryContainer.
- ContainerDefinition.Builder environment(Map<String,String> environment)
  The environment variables to set in the Docker container.
- ContainerDefinition.Builder image(String image)
  The path where inference code is stored.
- default ContainerDefinition.Builder imageConfig(Consumer<ImageConfig.Builder> imageConfig)
  Specifies whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC).
- ContainerDefinition.Builder imageConfig(ImageConfig imageConfig)
  Specifies whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC).
- ContainerDefinition.Builder inferenceSpecificationName(String inferenceSpecificationName)
  The inference specification name in the model package version.
- ContainerDefinition.Builder mode(String mode)
  Whether the container hosts a single model or multiple models.
- ContainerDefinition.Builder mode(ContainerMode mode)
  Whether the container hosts a single model or multiple models.
- default ContainerDefinition.Builder modelDataSource(Consumer<ModelDataSource.Builder> modelDataSource)
  Specifies the location of ML model data to deploy.
- ContainerDefinition.Builder modelDataSource(ModelDataSource modelDataSource)
  Specifies the location of ML model data to deploy.
- ContainerDefinition.Builder modelDataUrl(String modelDataUrl)
  The S3 path where the model artifacts, which result from model training, are stored.
- ContainerDefinition.Builder modelPackageName(String modelPackageName)
  The name or Amazon Resource Name (ARN) of the model package to use to create the model.
- default ContainerDefinition.Builder multiModelConfig(Consumer<MultiModelConfig.Builder> multiModelConfig)
  Specifies additional configuration for multi-model endpoints.
- ContainerDefinition.Builder multiModelConfig(MultiModelConfig multiModelConfig)
  Specifies additional configuration for multi-model endpoints.
- Methods inherited from interface software.amazon.awssdk.utils.builder.CopyableBuilder: copy
- Methods inherited from interface software.amazon.awssdk.utils.builder.SdkBuilder: applyMutation, build
- Methods inherited from interface software.amazon.awssdk.core.SdkPojo: equalsBySdkFields, sdkFields
Method Detail
-
containerHostname
ContainerDefinition.Builder containerHostname(String containerHostname)
This parameter is ignored for models that contain only a PrimaryContainer.

When a ContainerDefinition is part of an inference pipeline, the value of the parameter uniquely identifies the container for the purposes of logging and metrics. For information, see Use Logs and Metrics to Monitor an Inference Pipeline. If you don't specify a value for this parameter for a ContainerDefinition that is part of an inference pipeline, a unique name is automatically assigned based on the position of the ContainerDefinition in the pipeline. If you specify a value for the ContainerHostName for any ContainerDefinition that is part of an inference pipeline, you must specify a value for the ContainerHostName parameter of every ContainerDefinition in that pipeline.

- Parameters: containerHostname - This parameter is ignored for models that contain only a PrimaryContainer. When a ContainerDefinition is part of an inference pipeline, the value of the parameter uniquely identifies the container for the purposes of logging and metrics. For information, see Use Logs and Metrics to Monitor an Inference Pipeline. If you don't specify a value for this parameter for a ContainerDefinition that is part of an inference pipeline, a unique name is automatically assigned based on the position of the ContainerDefinition in the pipeline. If you specify a value for the ContainerHostName for any ContainerDefinition that is part of an inference pipeline, you must specify a value for the ContainerHostName parameter of every ContainerDefinition in that pipeline.
- Returns: Returns a reference to this object so that method calls can be chained together.
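As an illustration, here is a minimal sketch of assigning explicit hostnames to two containers intended for an inference pipeline. The hostnames and image URIs are placeholders, and the resulting containers would still need to be attached to a CreateModel request.

    import java.util.List;
    import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;

    class InferencePipelineContainers {
        // Build two containers with explicit hostnames; if any container in a
        // pipeline sets ContainerHostName, every container in that pipeline must.
        static List<ContainerDefinition> containers() {
            ContainerDefinition preprocess = ContainerDefinition.builder()
                    .containerHostname("preprocess")  // placeholder hostname
                    .image("111122223333.dkr.ecr.us-west-2.amazonaws.com/preprocess:latest")  // placeholder image
                    .build();
            ContainerDefinition predict = ContainerDefinition.builder()
                    .containerHostname("predict")     // placeholder hostname
                    .image("111122223333.dkr.ecr.us-west-2.amazonaws.com/predict:latest")     // placeholder image
                    .build();
            return List.of(preprocess, predict);
        }
    }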
-
image
ContainerDefinition.Builder image(String image)
The path where inference code is stored. This can be either in Amazon EC2 Container Registry or in a Docker registry that is accessible from the same VPC that you configure for your endpoint. If you are using your own custom algorithm instead of an algorithm provided by SageMaker, the inference code must meet SageMaker requirements. SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.

The model artifacts in an Amazon S3 bucket and the Docker image for the inference container in Amazon EC2 Container Registry must be in the same region as the model or endpoint you are creating.

- Parameters: image - The path where inference code is stored. This can be either in Amazon EC2 Container Registry or in a Docker registry that is accessible from the same VPC that you configure for your endpoint. If you are using your own custom algorithm instead of an algorithm provided by SageMaker, the inference code must meet SageMaker requirements. SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker. The model artifacts in an Amazon S3 bucket and the Docker image for the inference container in Amazon EC2 Container Registry must be in the same region as the model or endpoint you are creating.
- Returns: Returns a reference to this object so that method calls can be chained together.
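For example, a brief sketch of the two supported image path formats; the registry, repository, tag, and digest values below are placeholders.

    import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;

    class ImagePathFormats {
        // registry/repository[:tag] form, using a placeholder ECR registry.
        static ContainerDefinition byTag() {
            return ContainerDefinition.builder()
                    .image("111122223333.dkr.ecr.us-west-2.amazonaws.com/my-inference:1.0")
                    .build();
        }

        // registry/repository[@digest] form, pinning a placeholder digest.
        static ContainerDefinition byDigest() {
            return ContainerDefinition.builder()
                    .image("111122223333.dkr.ecr.us-west-2.amazonaws.com/my-inference@sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef")
                    .build();
        }
    }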
-
imageConfig
ContainerDefinition.Builder imageConfig(ImageConfig imageConfig)
Specifies whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC). For information about storing containers in a private Docker registry, see Use a Private Docker Registry for Real-Time Inference Containers.
The model artifacts in an Amazon S3 bucket and the Docker image for the inference container in Amazon EC2 Container Registry must be in the same region as the model or endpoint you are creating.

- Parameters: imageConfig - Specifies whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC). For information about storing containers in a private Docker registry, see Use a Private Docker Registry for Real-Time Inference Containers. The model artifacts in an Amazon S3 bucket and the Docker image for the inference container in Amazon EC2 Container Registry must be in the same region as the model or endpoint you are creating.
- Returns: Returns a reference to this object so that method calls can be chained together.
-
imageConfig
default ContainerDefinition.Builder imageConfig(Consumer<ImageConfig.Builder> imageConfig)
Specifies whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC). For information about storing containers in a private Docker registry, see Use a Private Docker Registry for Real-Time Inference Containers.
The model artifacts in an Amazon S3 bucket and the Docker image for the inference container in Amazon EC2 Container Registry must be in the same region as the model or endpoint you are creating.

This is a convenience method that creates an instance of the ImageConfig.Builder, avoiding the need to create one manually via ImageConfig.builder(). When the Consumer completes, SdkBuilder.build() is called immediately and its result is passed to imageConfig(ImageConfig).

- Parameters: imageConfig - a consumer that will call methods on ImageConfig.Builder
- Returns: Returns a reference to this object so that method calls can be chained together.
- See Also: imageConfig(ImageConfig)
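As a sketch, the two overloads below configure a container pulled from a registry reachable only through your VPC. It assumes ImageConfig exposes a repositoryAccessMode field taking the values Platform or Vpc, as in the SageMaker API; the image URI is a placeholder.

    import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;
    import software.amazon.awssdk.services.sagemaker.model.ImageConfig;

    class PrivateRegistryContainer {
        // Direct overload: build the ImageConfig yourself.
        static ContainerDefinition explicitConfig() {
            return ContainerDefinition.builder()
                    .image("registry.example.internal/my-inference:latest")  // placeholder private registry
                    .imageConfig(ImageConfig.builder()
                            .repositoryAccessMode("Vpc")                     // assumed field/value
                            .build())
                    .build();
        }

        // Consumer overload: the SDK creates and builds the ImageConfig.Builder for you.
        static ContainerDefinition consumerConfig() {
            return ContainerDefinition.builder()
                    .image("registry.example.internal/my-inference:latest")  // placeholder private registry
                    .imageConfig(cfg -> cfg.repositoryAccessMode("Vpc"))     // assumed field/value
                    .build();
        }
    }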
-
mode
ContainerDefinition.Builder mode(String mode)
Whether the container hosts a single model or multiple models.
- Parameters: mode - Whether the container hosts a single model or multiple models.
- Returns: Returns a reference to this object so that method calls can be chained together.
- See Also: ContainerMode
-
mode
ContainerDefinition.Builder mode(ContainerMode mode)
Whether the container hosts a single model or multiple models.
- Parameters: mode - Whether the container hosts a single model or multiple models.
- Returns: Returns a reference to this object so that method calls can be chained together.
- See Also: ContainerMode
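For instance, a minimal sketch of the enum and string overloads. The enum constant and string values shown (MULTI_MODEL, "SingleModel", "MultiModel") are assumed to mirror the SageMaker API's SingleModel/MultiModel values.

    import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;
    import software.amazon.awssdk.services.sagemaker.model.ContainerMode;

    class ContainerModeExamples {
        // Enum overload: preferred when the value is known at compile time.
        static ContainerDefinition multiModel() {
            return ContainerDefinition.builder()
                    .mode(ContainerMode.MULTI_MODEL)  // assumed enum constant
                    .build();
        }

        // String overload: useful when the mode comes from configuration.
        static ContainerDefinition fromString(String configuredMode) {
            return ContainerDefinition.builder()
                    .mode(configuredMode)  // e.g. "SingleModel" or "MultiModel"
                    .build();
        }
    }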
-
modelDataUrl
ContainerDefinition.Builder modelDataUrl(String modelDataUrl)
The S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip-compressed tar archive (.tar.gz suffix). The S3 path is required for SageMaker built-in algorithms, but not if you use your own algorithms. For more information on built-in algorithms, see Common Parameters.

The model artifacts must be in an S3 bucket that is in the same region as the model or endpoint you are creating.

If you provide a value for this parameter, SageMaker uses Amazon Web Services Security Token Service to download model artifacts from the S3 path you provide. Amazon Web Services STS is activated in your Amazon Web Services account by default. If you previously deactivated Amazon Web Services STS for a region, you need to reactivate Amazon Web Services STS for that region. For more information, see Activating and Deactivating Amazon Web Services STS in an Amazon Web Services Region in the Amazon Web Services Identity and Access Management User Guide.

If you use a built-in algorithm to create a model, SageMaker requires that you provide an S3 path to the model artifacts in ModelDataUrl.

- Parameters: modelDataUrl - The S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip-compressed tar archive (.tar.gz suffix). The S3 path is required for SageMaker built-in algorithms, but not if you use your own algorithms. For more information on built-in algorithms, see Common Parameters. The model artifacts must be in an S3 bucket that is in the same region as the model or endpoint you are creating. If you provide a value for this parameter, SageMaker uses Amazon Web Services Security Token Service to download model artifacts from the S3 path you provide. Amazon Web Services STS is activated in your Amazon Web Services account by default. If you previously deactivated Amazon Web Services STS for a region, you need to reactivate Amazon Web Services STS for that region. For more information, see Activating and Deactivating Amazon Web Services STS in an Amazon Web Services Region in the Amazon Web Services Identity and Access Management User Guide. If you use a built-in algorithm to create a model, SageMaker requires that you provide an S3 path to the model artifacts in ModelDataUrl.
- Returns: Returns a reference to this object so that method calls can be chained together.
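For example, a sketch pointing a container at a gzip-compressed model archive in S3; the bucket, key, and image URI are placeholders and must be in the same region as the model.

    import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;

    class ModelArtifactLocation {
        // The S3 path must reference a single .tar.gz archive of model artifacts.
        static ContainerDefinition container() {
            return ContainerDefinition.builder()
                    .image("111122223333.dkr.ecr.us-west-2.amazonaws.com/my-inference:latest")  // placeholder
                    .modelDataUrl("s3://amzn-s3-demo-bucket/training-output/model.tar.gz")      // placeholder
                    .build();
        }
    }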
-
modelDataSource
ContainerDefinition.Builder modelDataSource(ModelDataSource modelDataSource)
Specifies the location of ML model data to deploy.
Currently you cannot use ModelDataSource in conjunction with SageMaker batch transform, SageMaker serverless endpoints, SageMaker multi-model endpoints, and SageMaker Marketplace.

- Parameters: modelDataSource - Specifies the location of ML model data to deploy. Currently you cannot use ModelDataSource in conjunction with SageMaker batch transform, SageMaker serverless endpoints, SageMaker multi-model endpoints, and SageMaker Marketplace.
- Returns: Returns a reference to this object so that method calls can be chained together.
-
modelDataSource
default ContainerDefinition.Builder modelDataSource(Consumer<ModelDataSource.Builder> modelDataSource)
Specifies the location of ML model data to deploy.
Currently you cannot use ModelDataSource in conjunction with SageMaker batch transform, SageMaker serverless endpoints, SageMaker multi-model endpoints, and SageMaker Marketplace.

This is a convenience method that creates an instance of the ModelDataSource.Builder, avoiding the need to create one manually via ModelDataSource.builder(). When the Consumer completes, SdkBuilder.build() is called immediately and its result is passed to modelDataSource(ModelDataSource).

- Parameters: modelDataSource - a consumer that will call methods on ModelDataSource.Builder
- Returns: Returns a reference to this object so that method calls can be chained together.
- See Also: modelDataSource(ModelDataSource)
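As a sketch of the consumer overload, the example below assumes ModelDataSource wraps an s3DataSource whose members (s3Uri, s3DataType, compressionType) mirror the SageMaker API; the URI and image are placeholders.

    import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;

    class ModelDataSourceExample {
        // Nested Consumer builders: no explicit ModelDataSource.builder() call is needed.
        static ContainerDefinition container() {
            return ContainerDefinition.builder()
                    .image("111122223333.dkr.ecr.us-west-2.amazonaws.com/my-inference:latest")  // placeholder
                    .modelDataSource(mds -> mds.s3DataSource(s3 -> s3
                            .s3Uri("s3://amzn-s3-demo-bucket/models/my-model/")  // placeholder prefix
                            .s3DataType("S3Prefix")                              // assumed API value
                            .compressionType("None")))                           // assumed API value
                    .build();
        }
    }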
-
additionalModelDataSources
ContainerDefinition.Builder additionalModelDataSources(Collection<AdditionalModelDataSource> additionalModelDataSources)
Data sources that are available to your model in addition to the one that you specify for ModelDataSource when you use the CreateModel action.

- Parameters: additionalModelDataSources - Data sources that are available to your model in addition to the one that you specify for ModelDataSource when you use the CreateModel action.
- Returns: Returns a reference to this object so that method calls can be chained together.
-
additionalModelDataSources
ContainerDefinition.Builder additionalModelDataSources(AdditionalModelDataSource... additionalModelDataSources)
Data sources that are available to your model in addition to the one that you specify for ModelDataSource when you use the CreateModel action.

- Parameters: additionalModelDataSources - Data sources that are available to your model in addition to the one that you specify for ModelDataSource when you use the CreateModel action.
- Returns: Returns a reference to this object so that method calls can be chained together.
-
additionalModelDataSources
ContainerDefinition.Builder additionalModelDataSources(Consumer<AdditionalModelDataSource.Builder>... additionalModelDataSources)
Data sources that are available to your model in addition to the one that you specify for ModelDataSource when you use the CreateModel action.

This is a convenience method that creates an instance of the AdditionalModelDataSource.Builder, avoiding the need to create one manually via AdditionalModelDataSource.builder(). When the Consumer completes, SdkBuilder.build() is called immediately and its result is passed to additionalModelDataSources(List<AdditionalModelDataSource>).

- Parameters: additionalModelDataSources - a consumer that will call methods on AdditionalModelDataSource.Builder
- Returns: Returns a reference to this object so that method calls can be chained together.
- See Also: additionalModelDataSources(java.util.Collection<AdditionalModelDataSource>)
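For illustration, a sketch of the Consumer varargs overload that attaches one extra data source alongside the primary ModelDataSource. The channel name and URIs are placeholders, and the assumed AdditionalModelDataSource members (channelName, s3DataSource) follow the SageMaker API.

    import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;

    class AdditionalDataSources {
        // One Consumer per additional source; the SDK builds each AdditionalModelDataSource.
        static ContainerDefinition container() {
            return ContainerDefinition.builder()
                    .modelDataSource(mds -> mds.s3DataSource(s3 -> s3
                            .s3Uri("s3://amzn-s3-demo-bucket/models/base/")  // placeholder
                            .s3DataType("S3Prefix")
                            .compressionType("None")))
                    .additionalModelDataSources(extra -> extra
                            .channelName("adapter-weights")                  // placeholder channel
                            .s3DataSource(s3 -> s3
                                    .s3Uri("s3://amzn-s3-demo-bucket/models/adapter/")  // placeholder
                                    .s3DataType("S3Prefix")
                                    .compressionType("None")))
                    .build();
        }
    }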
-
environment
ContainerDefinition.Builder environment(Map<String,String> environment)
The environment variables to set in the Docker container.
The maximum length of each key and value in the Environment map is 1024 bytes. The maximum length of all keys and values in the map, combined, is 32 KB. If you pass multiple containers to a CreateModel request, then the maximum length of all of their maps, combined, is also 32 KB.

- Parameters: environment - The environment variables to set in the Docker container. The maximum length of each key and value in the Environment map is 1024 bytes. The maximum length of all keys and values in the map, combined, is 32 KB. If you pass multiple containers to a CreateModel request, then the maximum length of all of their maps, combined, is also 32 KB.
- Returns: Returns a reference to this object so that method calls can be chained together.
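For example, a brief sketch of setting container environment variables. The keys and values are illustrative and must respect the 1024-byte per-entry and 32 KB total limits.

    import java.util.Map;
    import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;

    class ContainerEnvironment {
        static ContainerDefinition container() {
            return ContainerDefinition.builder()
                    .environment(Map.of(
                            "SAGEMAKER_PROGRAM", "inference.py",  // illustrative entry point
                            "LOG_LEVEL", "INFO"))                 // illustrative setting
                    .build();
        }
    }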
-
modelPackageName
ContainerDefinition.Builder modelPackageName(String modelPackageName)
The name or Amazon Resource Name (ARN) of the model package to use to create the model.
- Parameters: modelPackageName - The name or Amazon Resource Name (ARN) of the model package to use to create the model.
- Returns: Returns a reference to this object so that method calls can be chained together.
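As a sketch, a container created from a versioned model package ARN rather than from an image; the ARN below is a placeholder.

    import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;

    class ModelPackageContainer {
        // When a model package supplies the container, no image or modelDataUrl is set here.
        static ContainerDefinition container() {
            return ContainerDefinition.builder()
                    .modelPackageName("arn:aws:sagemaker:us-west-2:111122223333:model-package/my-pkg-group/3")  // placeholder ARN
                    .build();
        }
    }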
-
inferenceSpecificationName
ContainerDefinition.Builder inferenceSpecificationName(String inferenceSpecificationName)
The inference specification name in the model package version.
- Parameters: inferenceSpecificationName - The inference specification name in the model package version.
- Returns: Returns a reference to this object so that method calls can be chained together.
-
multiModelConfig
ContainerDefinition.Builder multiModelConfig(MultiModelConfig multiModelConfig)
Specifies additional configuration for multi-model endpoints.
- Parameters: multiModelConfig - Specifies additional configuration for multi-model endpoints.
- Returns: Returns a reference to this object so that method calls can be chained together.
-
multiModelConfig
default ContainerDefinition.Builder multiModelConfig(Consumer<MultiModelConfig.Builder> multiModelConfig)
Specifies additional configuration for multi-model endpoints.
This is a convenience method that creates an instance of the MultiModelConfig.Builder, avoiding the need to create one manually via MultiModelConfig.builder(). When the Consumer completes, SdkBuilder.build() is called immediately and its result is passed to multiModelConfig(MultiModelConfig).

- Parameters: multiModelConfig - a consumer that will call methods on MultiModelConfig.Builder
- Returns: Returns a reference to this object so that method calls can be chained together.
- See Also: multiModelConfig(MultiModelConfig)
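As a sketch, a multi-model container that disables model caching via the consumer overload. It assumes MultiModelConfig exposes a modelCacheSetting field with the values Enabled or Disabled, as in the SageMaker API, and that the ContainerMode enum constant is named MULTI_MODEL.

    import software.amazon.awssdk.services.sagemaker.model.ContainerDefinition;
    import software.amazon.awssdk.services.sagemaker.model.ContainerMode;

    class MultiModelContainer {
        static ContainerDefinition container() {
            return ContainerDefinition.builder()
                    .mode(ContainerMode.MULTI_MODEL)                              // assumed enum constant
                    .multiModelConfig(cfg -> cfg.modelCacheSetting("Disabled"))   // assumed field/value
                    .build();
        }
    }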