@Generated(value="com.amazonaws:aws-java-sdk-code-generator") public class ContainerDefinition extends Object implements Serializable, Cloneable, StructuredPojo
Describes the container, as part of a model definition.
| Constructor and Description |
|---|
| ContainerDefinition() |
| Modifier and Type | Method and Description |
|---|---|
| ContainerDefinition | addEnvironmentEntry(String key, String value) |
| ContainerDefinition | clearEnvironmentEntries() Removes all the entries added into Environment. |
| ContainerDefinition | clone() |
| boolean | equals(Object obj) |
| String | getContainerHostname() This parameter is ignored for models that contain only a PrimaryContainer. |
| Map<String,String> | getEnvironment() The environment variables to set in the Docker container. |
| String | getImage() The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored. |
| String | getModelDataUrl() The S3 path where the model artifacts, which result from model training, are stored. |
| String | getModelPackageName() The name or Amazon Resource Name (ARN) of the model package to use to create the model. |
| int | hashCode() |
| void | marshall(ProtocolMarshaller protocolMarshaller) Marshalls this structured data using the given ProtocolMarshaller. |
| void | setContainerHostname(String containerHostname) This parameter is ignored for models that contain only a PrimaryContainer. |
| void | setEnvironment(Map<String,String> environment) The environment variables to set in the Docker container. |
| void | setImage(String image) The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored. |
| void | setModelDataUrl(String modelDataUrl) The S3 path where the model artifacts, which result from model training, are stored. |
| void | setModelPackageName(String modelPackageName) The name or Amazon Resource Name (ARN) of the model package to use to create the model. |
| String | toString() Returns a string representation of this object. |
| ContainerDefinition | withContainerHostname(String containerHostname) This parameter is ignored for models that contain only a PrimaryContainer. |
| ContainerDefinition | withEnvironment(Map<String,String> environment) The environment variables to set in the Docker container. |
| ContainerDefinition | withImage(String image) The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored. |
| ContainerDefinition | withModelDataUrl(String modelDataUrl) The S3 path where the model artifacts, which result from model training, are stored. |
| ContainerDefinition | withModelPackageName(String modelPackageName) The name or Amazon Resource Name (ARN) of the model package to use to create the model. |
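All with* methods and addEnvironmentEntry return this ContainerDefinition, so a definition can be built as a single chained expression. The following is a minimal sketch; the CreateModelRequest, AmazonSageMakerClientBuilder, and createModel calls, as well as every name, ARN, image path, and environment variable, are assumptions for illustration and are not defined on this page.

```java
import com.amazonaws.services.sagemaker.AmazonSageMaker;
import com.amazonaws.services.sagemaker.AmazonSageMakerClientBuilder;
import com.amazonaws.services.sagemaker.model.ContainerDefinition;
import com.amazonaws.services.sagemaker.model.CreateModelRequest;

public class ContainerDefinitionExample {
    public static void main(String[] args) {
        // Build the container definition with the fluent with* methods.
        ContainerDefinition container = new ContainerDefinition()
                // ECR image path in registry/repository[:tag] form (placeholder values).
                .withImage("123456789012.dkr.ecr.us-east-1.amazonaws.com/my-inference-image:latest")
                // S3 path to a single gzip-compressed tar archive of model artifacts (placeholder).
                .withModelDataUrl("s3://my-bucket/model/model.tar.gz")
                // One environment variable passed to the Docker container (placeholder name/value).
                .addEnvironmentEntry("SAGEMAKER_PROGRAM", "inference.py");

        // Assumed usage: attach the definition as the primary container of a new model.
        AmazonSageMaker sageMaker = AmazonSageMakerClientBuilder.defaultClient();
        sageMaker.createModel(new CreateModelRequest()
                .withModelName("my-model")                                            // placeholder name
                .withExecutionRoleArn("arn:aws:iam::123456789012:role/SageMakerRole") // placeholder ARN
                .withPrimaryContainer(container));
    }
}
```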
public void setContainerHostname(String containerHostname)

This parameter is ignored for models that contain only a PrimaryContainer.

When a ContainerDefinition is part of an inference pipeline, the value of this parameter uniquely identifies the container for the purposes of logging and metrics. For information, see Use Logs and Metrics to Monitor an Inference Pipeline. If you don't specify a value for this parameter for a ContainerDefinition that is part of an inference pipeline, a unique name is automatically assigned based on the position of the ContainerDefinition in the pipeline. If you specify a value for the ContainerHostName for any ContainerDefinition that is part of an inference pipeline, you must specify a value for the ContainerHostName parameter of every ContainerDefinition in that pipeline.

Parameters:
containerHostname - This parameter is ignored for models that contain only a PrimaryContainer. When the ContainerDefinition is part of an inference pipeline, the value uniquely identifies the container for logging and metrics, as described above.
public String getContainerHostname()

This parameter is ignored for models that contain only a PrimaryContainer.

When a ContainerDefinition is part of an inference pipeline, the value of this parameter uniquely identifies the container for the purposes of logging and metrics. For information, see Use Logs and Metrics to Monitor an Inference Pipeline. If you don't specify a value for this parameter for a ContainerDefinition that is part of an inference pipeline, a unique name is automatically assigned based on the position of the ContainerDefinition in the pipeline. If you specify a value for the ContainerHostName for any ContainerDefinition that is part of an inference pipeline, you must specify a value for the ContainerHostName parameter of every ContainerDefinition in that pipeline.

Returns:
This parameter is ignored for models that contain only a PrimaryContainer. When the ContainerDefinition is part of an inference pipeline, the value uniquely identifies the container for logging and metrics, as described above.
public ContainerDefinition withContainerHostname(String containerHostname)

This parameter is ignored for models that contain only a PrimaryContainer.

When a ContainerDefinition is part of an inference pipeline, the value of this parameter uniquely identifies the container for the purposes of logging and metrics. For information, see Use Logs and Metrics to Monitor an Inference Pipeline. If you don't specify a value for this parameter for a ContainerDefinition that is part of an inference pipeline, a unique name is automatically assigned based on the position of the ContainerDefinition in the pipeline. If you specify a value for the ContainerHostName for any ContainerDefinition that is part of an inference pipeline, you must specify a value for the ContainerHostName parameter of every ContainerDefinition in that pipeline.

Parameters:
containerHostname - This parameter is ignored for models that contain only a PrimaryContainer. When the ContainerDefinition is part of an inference pipeline, the value uniquely identifies the container for logging and metrics, as described above.
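For inference pipelines, either no ContainerDefinition sets ContainerHostName or every one of them does. A minimal sketch of that all-or-nothing rule, with placeholder hostnames, image paths, and S3 paths:

```java
import com.amazonaws.services.sagemaker.model.ContainerDefinition;
import java.util.Arrays;
import java.util.List;

public class PipelineHostnameExample {
    // Either give every pipeline container an explicit ContainerHostName, or give none of them one.
    public static List<ContainerDefinition> pipelineContainers() {
        ContainerDefinition preprocess = new ContainerDefinition()
                .withContainerHostname("preprocess")   // appears in logs and metrics
                .withImage("123456789012.dkr.ecr.us-east-1.amazonaws.com/preprocess:latest")
                .withModelDataUrl("s3://my-bucket/preprocess/model.tar.gz");

        ContainerDefinition predict = new ContainerDefinition()
                .withContainerHostname("predict")      // required here because "preprocess" sets one
                .withImage("123456789012.dkr.ecr.us-east-1.amazonaws.com/predict:latest")
                .withModelDataUrl("s3://my-bucket/predict/model.tar.gz");

        return Arrays.asList(preprocess, predict);
    }
}
```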
public void setImage(String image)

The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored. If you are using your own custom algorithm instead of an algorithm provided by Amazon SageMaker, the inference code must meet Amazon SageMaker requirements. Amazon SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.

Parameters:
image - The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored, in either the registry/repository[:tag] or the registry/repository[@digest] format.

public String getImage()

The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored. If you are using your own custom algorithm instead of an algorithm provided by Amazon SageMaker, the inference code must meet Amazon SageMaker requirements. Amazon SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.

Returns:
The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored, in either the registry/repository[:tag] or the registry/repository[@digest] format.

public ContainerDefinition withImage(String image)

The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored. If you are using your own custom algorithm instead of an algorithm provided by Amazon SageMaker, the inference code must meet Amazon SageMaker requirements. Amazon SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.

Parameters:
image - The Amazon EC2 Container Registry (Amazon ECR) path where inference code is stored, in either the registry/repository[:tag] or the registry/repository[@digest] format.
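Both supported image path formats can be passed unchanged to setImage or withImage. A minimal sketch; the account ID, Region, repository name, tag, and digest below are placeholders:

```java
import com.amazonaws.services.sagemaker.model.ContainerDefinition;

public class ImagePathExample {
    public static void main(String[] args) {
        // registry/repository[:tag] form (all values are placeholders).
        ContainerDefinition byTag = new ContainerDefinition()
                .withImage("123456789012.dkr.ecr.us-west-2.amazonaws.com/my-algo:1.0");

        // registry/repository[@digest] form pins an exact image build (placeholder digest).
        ContainerDefinition byDigest = new ContainerDefinition()
                .withImage("123456789012.dkr.ecr.us-west-2.amazonaws.com/my-algo"
                        + "@sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef");

        System.out.println(byTag.getImage());
        System.out.println(byDigest.getImage());
    }
}
```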
public void setModelDataUrl(String modelDataUrl)

The S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip-compressed tar archive (.tar.gz suffix). The S3 path is required for Amazon SageMaker built-in algorithms, but not if you use your own algorithms. For more information on built-in algorithms, see Common Parameters.

If you provide a value for this parameter, Amazon SageMaker uses AWS Security Token Service (AWS STS) to download model artifacts from the S3 path you provide. AWS STS is activated in your IAM user account by default. If you previously deactivated AWS STS for a Region, you need to reactivate AWS STS for that Region. For more information, see Activating and Deactivating AWS STS in an AWS Region in the AWS Identity and Access Management User Guide.

If you use a built-in algorithm to create a model, Amazon SageMaker requires that you provide an S3 path to the model artifacts in ModelDataUrl.

Parameters:
modelDataUrl - The S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip-compressed tar archive (.tar.gz suffix); it is required for Amazon SageMaker built-in algorithms, but not if you use your own algorithms.
public String getModelDataUrl()

The S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip-compressed tar archive (.tar.gz suffix). The S3 path is required for Amazon SageMaker built-in algorithms, but not if you use your own algorithms. For more information on built-in algorithms, see Common Parameters.

If you provide a value for this parameter, Amazon SageMaker uses AWS Security Token Service (AWS STS) to download model artifacts from the S3 path you provide. AWS STS is activated in your IAM user account by default. If you previously deactivated AWS STS for a Region, you need to reactivate AWS STS for that Region. For more information, see Activating and Deactivating AWS STS in an AWS Region in the AWS Identity and Access Management User Guide.

If you use a built-in algorithm to create a model, Amazon SageMaker requires that you provide an S3 path to the model artifacts in ModelDataUrl.

Returns:
The S3 path where the model artifacts, which result from model training, are stored, as a single gzip-compressed tar archive (.tar.gz suffix).
public ContainerDefinition withModelDataUrl(String modelDataUrl)

The S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip-compressed tar archive (.tar.gz suffix). The S3 path is required for Amazon SageMaker built-in algorithms, but not if you use your own algorithms. For more information on built-in algorithms, see Common Parameters.

If you provide a value for this parameter, Amazon SageMaker uses AWS Security Token Service (AWS STS) to download model artifacts from the S3 path you provide. AWS STS is activated in your IAM user account by default. If you previously deactivated AWS STS for a Region, you need to reactivate AWS STS for that Region. For more information, see Activating and Deactivating AWS STS in an AWS Region in the AWS Identity and Access Management User Guide.

If you use a built-in algorithm to create a model, Amazon SageMaker requires that you provide an S3 path to the model artifacts in ModelDataUrl.

Parameters:
modelDataUrl - The S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip-compressed tar archive (.tar.gz suffix); it is required for Amazon SageMaker built-in algorithms, but not if you use your own algorithms.
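A minimal sketch of setting ModelDataUrl; the bucket and key are placeholders, and the object they name is assumed to be a single gzip-compressed tar archive as required above:

```java
import com.amazonaws.services.sagemaker.model.ContainerDefinition;

public class ModelDataUrlExample {
    public static void main(String[] args) {
        // The S3 object must be a single gzip-compressed tar archive (.tar.gz).
        ContainerDefinition container = new ContainerDefinition()
                .withImage("123456789012.dkr.ecr.us-east-1.amazonaws.com/my-algo:latest")
                .withModelDataUrl("s3://my-bucket/training-output/model.tar.gz");

        System.out.println(container.getModelDataUrl());
    }
}
```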
public Map<String,String> getEnvironment()

The environment variables to set in the Docker container. Each key and value in the Environment string-to-string map can have a length of up to 1024 characters. We support up to 16 entries in the map.

Returns:
The environment variables to set in the Docker container.

public void setEnvironment(Map<String,String> environment)

The environment variables to set in the Docker container. Each key and value in the Environment string-to-string map can have a length of up to 1024 characters. We support up to 16 entries in the map.

Parameters:
environment - The environment variables to set in the Docker container. Each key and value can have a length of up to 1024 characters; up to 16 entries are supported.

public ContainerDefinition withEnvironment(Map<String,String> environment)

The environment variables to set in the Docker container. Each key and value in the Environment string-to-string map can have a length of up to 1024 characters. We support up to 16 entries in the map.

Parameters:
environment - The environment variables to set in the Docker container. Each key and value can have a length of up to 1024 characters; up to 16 entries are supported.

public ContainerDefinition addEnvironmentEntry(String key, String value)
public ContainerDefinition clearEnvironmentEntries()

Removes all the entries added into Environment.
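The environment map can be supplied in one call with withEnvironment or built up entry by entry with addEnvironmentEntry, and clearEnvironmentEntries discards everything added so far. A minimal sketch with placeholder variable names and values:

```java
import com.amazonaws.services.sagemaker.model.ContainerDefinition;
import java.util.HashMap;
import java.util.Map;

public class EnvironmentExample {
    public static void main(String[] args) {
        // Supply the whole map at once (up to 16 entries; keys and values up to 1024 characters each).
        Map<String, String> env = new HashMap<>();
        env.put("LOG_LEVEL", "INFO");                           // placeholder variable
        ContainerDefinition container = new ContainerDefinition()
                .withEnvironment(env)
                // ...or add entries one at a time.
                .addEnvironmentEntry("MODEL_SERVER_WORKERS", "2"); // placeholder variable

        System.out.println(container.getEnvironment());

        // Remove everything that was added so far.
        container.clearEnvironmentEntries();
    }
}
```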
public void setModelPackageName(String modelPackageName)

The name or Amazon Resource Name (ARN) of the model package to use to create the model.

Parameters:
modelPackageName - The name or Amazon Resource Name (ARN) of the model package to use to create the model.

public String getModelPackageName()

The name or Amazon Resource Name (ARN) of the model package to use to create the model.

Returns:
The name or Amazon Resource Name (ARN) of the model package to use to create the model.

public ContainerDefinition withModelPackageName(String modelPackageName)

The name or Amazon Resource Name (ARN) of the model package to use to create the model.

Parameters:
modelPackageName - The name or Amazon Resource Name (ARN) of the model package to use to create the model.
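A minimal sketch of referencing a model package by ARN; the ARN is a placeholder, and how ModelPackageName interacts with Image and ModelDataUrl is not covered on this page:

```java
import com.amazonaws.services.sagemaker.model.ContainerDefinition;

public class ModelPackageExample {
    public static void main(String[] args) {
        // Reference an existing model package by its ARN (placeholder value).
        ContainerDefinition container = new ContainerDefinition()
                .withModelPackageName("arn:aws:sagemaker:us-east-1:123456789012:model-package/my-package/1");

        System.out.println(container.getModelPackageName());
    }
}
```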
public String toString()

Returns a string representation of this object.

Overrides:
toString in class Object

See Also:
Object.toString()
public ContainerDefinition clone()
public void marshall(ProtocolMarshaller protocolMarshaller)

Marshalls this structured data using the given ProtocolMarshaller.

Specified by:
marshall in interface StructuredPojo

Parameters:
protocolMarshaller - Implementation of ProtocolMarshaller used to marshall this object's data.