The timeout time interval for an object attempt. If an attempt does not complete within the start time plus this time interval, AWS Data Pipeline marks the attempt as failed and your retry settings determine the next steps taken.
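As an illustrative sketch, this timeout could be declared on an activity object in a pipeline definition; the object id, command, and the specific period value here are hypothetical:

```json
{
  "id": "MyCommandActivity",
  "type": "ShellCommandActivity",
  "command": "echo hello",
  "attemptTimeout": "1 Hour",
  "maximumRetries": "2"
}
```

With these assumed values, an attempt that runs longer than one hour is marked failed, and up to two retries follow.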
The command to run. This value and any associated parameters must function in the environment from which you are running the Task Runner.
One or more references to other Activities that must reach the FINISHED state before this activity will start.
Determines whether pipeline object failures and rerun commands cascade through pipeline object dependencies. Possible values include cascade and none.
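For illustration, the cascade behavior might be requested like this in a pipeline definition; the object id and command are hypothetical:

```json
{
  "id": "MyCascadingActivity",
  "type": "ShellCommandActivity",
  "command": "exit 1",
  "failureAndRerunMode": "cascade"
}
```

Under this assumed setting, a failure of this object would propagate through its dependents rather than being contained to the object itself.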
The ID of the object. IDs must be unique within a pipeline definition.
The input data source.
The time period in which the object run must start. If the object does not start within the scheduled start time plus this time interval, it is considered late.
The maximum number of times to retry the action. The default value is 2, which results in 3 tries total (1 original attempt plus 2 retries). The maximum value is 5 (6 total attempts).
The optional, user-defined label of the object. If you do not provide a name for an object in a pipeline definition, AWS Data Pipeline automatically duplicates the value of id.
The SNS alarm to raise when the activity fails.
The SNS alarm to raise when the activity fails to start on time.
The SNS alarm to raise when the activity succeeds.
The location for the output.
A condition that must be met before the object can run. To specify multiple conditions, add multiple precondition fields. The activity cannot run until all its conditions are met.
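As a sketch of multiple preconditions on one object, the precondition field can carry several references; the precondition object names here are hypothetical:

```json
{
  "id": "MyGatedActivity",
  "type": "ShellCommandActivity",
  "command": "echo ready",
  "precondition": [
    { "ref": "InputDataExists" },
    { "ref": "HostIsReachable" }
  ]
}
```

With this assumed layout, the activity stays pending until both referenced preconditions are satisfied.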
The timeout duration between two retry attempts. The default is 10 minutes.
A list of arguments to pass to the shell script.
An Amazon S3 URI path for a file to download and run as a shell command. Only one scriptUri or command field should be present. scriptUri cannot use parameters; use command instead.
Determines whether staging is enabled and allows your shell commands to have access to the staged-data variables, such as ${INPUT1_STAGING_DIR}.
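A sketch of staging in use: with stage enabled, the command can read its input through the staged-data variable. The bucket reference, log file name, and grep invocation below are illustrative assumptions, not values from this reference:

```json
{
  "id": "MyStagedActivity",
  "type": "ShellCommandActivity",
  "stage": "true",
  "input": { "ref": "MyS3InputData" },
  "command": "grep -c ERROR ${INPUT1_STAGING_DIR}/app.log"
}
```

Because staging is enabled, ${INPUT1_STAGING_DIR} resolves at run time to the local directory where the input data source has been staged.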
The path that receives redirected system error messages from the command. If you use the runsOn field, this must be an Amazon S3 path because of the transitory nature of the resource running your activity. However, if you specify the workerGroup field, a local file path is permitted.
The Amazon S3 path that receives redirected output from the command. If you use the runsOn field, this must be an Amazon S3 path because of the transitory nature of the resource running your activity. However, if you specify the workerGroup field, a local file path is permitted.
The type of object. Use one of the predefined AWS Data Pipeline object types.
The worker group. This is used for routing tasks. If you provide a runsOn value and workerGroup exists, workerGroup is ignored.
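Pulling several of the fields above together, a minimal activity definition might look like the following. The resource name, bucket paths, alarm reference, and the command itself are assumptions for illustration only:

```json
{
  "id": "MyShellCommand",
  "name": "MyShellCommand",
  "type": "ShellCommandActivity",
  "runsOn": { "ref": "MyEc2Resource" },
  "command": "echo hello > /tmp/hello.txt",
  "stdout": "s3://example-bucket/logs/stdout.txt",
  "stderr": "s3://example-bucket/logs/stderr.txt",
  "maximumRetries": "2",
  "onFail": { "ref": "FailureNotify" }
}
```

Note that because this sketch uses runsOn rather than workerGroup, both stdout and stderr point to Amazon S3 paths, as the stdout and stderr field descriptions require.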
Runs a command on an EC2 node. You specify the input S3 location, the output S3 location, and the script or command.