@Generated(value="com.amazonaws:aws-java-sdk-code-generator") public class TransformInput extends Object implements Serializable, Cloneable, StructuredPojo
Describes the input source of a transform job and the way the transform job consumes it.
| Constructor and Description |
|---|
| `TransformInput()` |
| Modifier and Type | Method and Description |
|---|---|
| `TransformInput` | `clone()` |
| `boolean` | `equals(Object obj)` |
| `String` | `getCompressionType()` Compressing data helps save on storage space. |
| `String` | `getContentType()` The multipurpose internet mail extension (MIME) type of the data. |
| `TransformDataSource` | `getDataSource()` Describes the location of the channel data, meaning the S3 location of the input data that the model can consume. |
| `String` | `getSplitType()` The method to use to split the transform job's data into smaller batches. |
| `int` | `hashCode()` |
| `void` | `marshall(ProtocolMarshaller protocolMarshaller)` Marshalls this structured data using the given `ProtocolMarshaller`. |
| `void` | `setCompressionType(String compressionType)` Compressing data helps save on storage space. |
| `void` | `setContentType(String contentType)` The multipurpose internet mail extension (MIME) type of the data. |
| `void` | `setDataSource(TransformDataSource dataSource)` Describes the location of the channel data, meaning the S3 location of the input data that the model can consume. |
| `void` | `setSplitType(String splitType)` The method to use to split the transform job's data into smaller batches. |
| `String` | `toString()` Returns a string representation of this object. |
| `TransformInput` | `withCompressionType(CompressionType compressionType)` Compressing data helps save on storage space. |
| `TransformInput` | `withCompressionType(String compressionType)` Compressing data helps save on storage space. |
| `TransformInput` | `withContentType(String contentType)` The multipurpose internet mail extension (MIME) type of the data. |
| `TransformInput` | `withDataSource(TransformDataSource dataSource)` Describes the location of the channel data, meaning the S3 location of the input data that the model can consume. |
| `TransformInput` | `withSplitType(SplitType splitType)` The method to use to split the transform job's data into smaller batches. |
| `TransformInput` | `withSplitType(String splitType)` The method to use to split the transform job's data into smaller batches. |
public void setDataSource(TransformDataSource dataSource)

Describes the location of the channel data, meaning the S3 location of the input data that the model can consume.

Parameters:
dataSource - Describes the location of the channel data, meaning the S3 location of the input data that the model can consume.

public TransformDataSource getDataSource()

Describes the location of the channel data, meaning the S3 location of the input data that the model can consume.

Returns:
The location of the channel data, meaning the S3 location of the input data that the model can consume.

public TransformInput withDataSource(TransformDataSource dataSource)

Describes the location of the channel data, meaning the S3 location of the input data that the model can consume.

Parameters:
dataSource - Describes the location of the channel data, meaning the S3 location of the input data that the model can consume.
Returns:
Returns a reference to this object so that method calls can be chained together.

public void setContentType(String contentType)

The multipurpose internet mail extension (MIME) type of the data. Amazon SageMaker uses the MIME type with each HTTP call to transfer data to the transform job.

Parameters:
contentType - The multipurpose internet mail extension (MIME) type of the data. Amazon SageMaker uses the MIME type with each HTTP call to transfer data to the transform job.

public String getContentType()

The multipurpose internet mail extension (MIME) type of the data. Amazon SageMaker uses the MIME type with each HTTP call to transfer data to the transform job.

Returns:
The multipurpose internet mail extension (MIME) type of the data. Amazon SageMaker uses the MIME type with each HTTP call to transfer data to the transform job.

public TransformInput withContentType(String contentType)

The multipurpose internet mail extension (MIME) type of the data. Amazon SageMaker uses the MIME type with each HTTP call to transfer data to the transform job.

Parameters:
contentType - The multipurpose internet mail extension (MIME) type of the data. Amazon SageMaker uses the MIME type with each HTTP call to transfer data to the transform job.
Returns:
Returns a reference to this object so that method calls can be chained together.

public void setCompressionType(String compressionType)

Compressing data helps save on storage space. If your transform data is compressed, specify the compression type. Amazon SageMaker automatically decompresses the data for the transform job accordingly. The default value is None.

Parameters:
compressionType - Compressing data helps save on storage space. If your transform data is compressed, specify the compression type. Amazon SageMaker automatically decompresses the data for the transform job accordingly. The default value is None.
See Also:
CompressionType

public String getCompressionType()

Compressing data helps save on storage space. If your transform data is compressed, specify the compression type. Amazon SageMaker automatically decompresses the data for the transform job accordingly. The default value is None.

Returns:
Compressing data helps save on storage space. If your transform data is compressed, specify the compression type. Amazon SageMaker automatically decompresses the data for the transform job accordingly. The default value is None.
See Also:
CompressionType

public TransformInput withCompressionType(String compressionType)

Compressing data helps save on storage space. If your transform data is compressed, specify the compression type. Amazon SageMaker automatically decompresses the data for the transform job accordingly. The default value is None.

Parameters:
compressionType - Compressing data helps save on storage space. If your transform data is compressed, specify the compression type. Amazon SageMaker automatically decompresses the data for the transform job accordingly. The default value is None.
Returns:
Returns a reference to this object so that method calls can be chained together.
See Also:
CompressionType

public TransformInput withCompressionType(CompressionType compressionType)

Compressing data helps save on storage space. If your transform data is compressed, specify the compression type. Amazon SageMaker automatically decompresses the data for the transform job accordingly. The default value is None.

Parameters:
compressionType - Compressing data helps save on storage space. If your transform data is compressed, specify the compression type. Amazon SageMaker automatically decompresses the data for the transform job accordingly. The default value is None.
Returns:
Returns a reference to this object so that method calls can be chained together.
See Also:
CompressionType
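The two `withCompressionType` overloads above store the same underlying string; the difference is only in compile-time safety. A minimal sketch (assuming the AWS SDK for Java 1.x on the classpath):

```java
import com.amazonaws.services.sagemaker.model.CompressionType;
import com.amazonaws.services.sagemaker.model.TransformInput;

public class CompressionTypeOverloads {
    public static void main(String[] args) {
        // String overload: the value is stored as-is, so a typo surfaces only at request time.
        TransformInput fromString = new TransformInput().withCompressionType("Gzip");

        // Enum overload: the compiler rejects values that are not CompressionType members.
        TransformInput fromEnum = new TransformInput().withCompressionType(CompressionType.Gzip);

        // Both overloads store the same string internally.
        System.out.println(fromString.getCompressionType().equals(fromEnum.getCompressionType()));
    }
}
```

Preferring the enum overload is a defensive choice, not a requirement of the API.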
public void setSplitType(String splitType)

The method to use to split the transform job's data into smaller batches. The default value is None.

If you don't want to split the data, specify None. If you want to split records on a newline character boundary, specify Line. To split records according to the RecordIO format, specify RecordIO.

Amazon SageMaker will send the maximum number of records per batch in each request, up to the MaxPayloadInMB limit. For more information, see RecordIO data format. For information about the RecordIO format, see Data Format.

Parameters:
splitType - The method to use to split the transform job's data into smaller batches. The default value is None. If you don't want to split the data, specify None. If you want to split records on a newline character boundary, specify Line. To split records according to the RecordIO format, specify RecordIO. Amazon SageMaker will send the maximum number of records per batch in each request, up to the MaxPayloadInMB limit. For more information, see RecordIO data format. For information about the RecordIO format, see Data Format.
See Also:
SplitType

public String getSplitType()

The method to use to split the transform job's data into smaller batches. The default value is None.

If you don't want to split the data, specify None. If you want to split records on a newline character boundary, specify Line. To split records according to the RecordIO format, specify RecordIO.

Amazon SageMaker will send the maximum number of records per batch in each request, up to the MaxPayloadInMB limit. For more information, see RecordIO data format. For information about the RecordIO format, see Data Format.

Returns:
The method to use to split the transform job's data into smaller batches. The default value is None. If you don't want to split the data, specify None. If you want to split records on a newline character boundary, specify Line. To split records according to the RecordIO format, specify RecordIO. Amazon SageMaker will send the maximum number of records per batch in each request, up to the MaxPayloadInMB limit. For more information, see RecordIO data format. For information about the RecordIO format, see Data Format.
See Also:
SplitType

public TransformInput withSplitType(String splitType)

The method to use to split the transform job's data into smaller batches. The default value is None.

If you don't want to split the data, specify None. If you want to split records on a newline character boundary, specify Line. To split records according to the RecordIO format, specify RecordIO.

Amazon SageMaker will send the maximum number of records per batch in each request, up to the MaxPayloadInMB limit. For more information, see RecordIO data format. For information about the RecordIO format, see Data Format.

Parameters:
splitType - The method to use to split the transform job's data into smaller batches. The default value is None. If you don't want to split the data, specify None. If you want to split records on a newline character boundary, specify Line. To split records according to the RecordIO format, specify RecordIO. Amazon SageMaker will send the maximum number of records per batch in each request, up to the MaxPayloadInMB limit. For more information, see RecordIO data format. For information about the RecordIO format, see Data Format.
Returns:
Returns a reference to this object so that method calls can be chained together.
See Also:
SplitType

public TransformInput withSplitType(SplitType splitType)

The method to use to split the transform job's data into smaller batches. The default value is None.

If you don't want to split the data, specify None. If you want to split records on a newline character boundary, specify Line. To split records according to the RecordIO format, specify RecordIO.

Amazon SageMaker will send the maximum number of records per batch in each request, up to the MaxPayloadInMB limit. For more information, see RecordIO data format. For information about the RecordIO format, see Data Format.

Parameters:
splitType - The method to use to split the transform job's data into smaller batches. The default value is None. If you don't want to split the data, specify None. If you want to split records on a newline character boundary, specify Line. To split records according to the RecordIO format, specify RecordIO. Amazon SageMaker will send the maximum number of records per batch in each request, up to the MaxPayloadInMB limit. For more information, see RecordIO data format. For information about the RecordIO format, see Data Format.
Returns:
Returns a reference to this object so that method calls can be chained together.
See Also:
SplitType
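A short sketch of how the split-type choice plays out on the client side (assuming the AWS SDK for Java 1.x on the classpath; pairing newline-delimited content types with Line is a common convention, not a requirement of the API):

```java
import com.amazonaws.services.sagemaker.model.SplitType;
import com.amazonaws.services.sagemaker.model.TransformInput;

public class SplitTypeExample {
    public static void main(String[] args) {
        // Newline-delimited input such as CSV pairs naturally with Line splitting.
        TransformInput csvInput = new TransformInput()
                .withContentType("text/csv")
                .withSplitType(SplitType.Line);

        // Omitting the split type leaves the client-side field unset;
        // the service applies the default (None) when the job runs.
        TransformInput wholeFile = new TransformInput()
                .withContentType("application/x-image");

        System.out.println(csvInput.getSplitType());   // prints "Line"
        System.out.println(wholeFile.getSplitType());  // prints "null": unset until the service defaults it
    }
}
```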
public String toString()

Returns a string representation of this object.

Overrides:
toString in class Object
See Also:
Object.toString()

public TransformInput clone()

public void marshall(ProtocolMarshaller protocolMarshaller)

Description copied from interface: StructuredPojo
Marshalls this structured data using the given ProtocolMarshaller.

Specified by:
marshall in interface StructuredPojo
Parameters:
protocolMarshaller - Implementation of ProtocolMarshaller used to marshall this object's data.

Copyright © 2013 Amazon Web Services, Inc. All Rights Reserved.