A Unix/Linux shell command that can be run as a precondition (the ShellCommandPrecondition object). The following fields apply to this object.

command: The command to run. This value and any associated parameters must function in the environment from which you are running the Task Runner.

id: The ID of the object. IDs must be unique within a pipeline definition.

maximumRetries: The maximum number of attempt retries on failure.

name: The optional, user-defined label of the object. If you do not provide a name for an object in a pipeline definition, AWS Data Pipeline automatically duplicates the value of id.

onFail: Actions to run when the current object fails.

onLateAction: Actions to trigger if an object has not yet been scheduled or is still not completed.

onSuccess: Actions to run when the current object succeeds.

preconditionTimeout: The precondition will be retried until the retryTimeout, with a gap of retryDelay between attempts. Time period; for example, "1 hour".

role: The IAM role to use for this precondition.

scriptArgument: A list of arguments to pass to the shell script.

scriptUri: An Amazon S3 URI path for a file to download and run as a shell command. Only one of scriptUri or command should be present. scriptUri cannot use parameters; use command instead.

stderr: The Amazon S3 path that receives redirected system error messages from the command. If you use the runsOn field, this must be an Amazon S3 path because of the transitory nature of the resource running your activity. However, if you specify the workerGroup field, a local file path is permitted.

stdout: The Amazon S3 path that receives redirected output from the command. If you use the runsOn field, this must be an Amazon S3 path because of the transitory nature of the resource running your activity. However, if you specify the workerGroup field, a local file path is permitted.

type: The type of object. Use one of the predefined AWS Data Pipeline object types.
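As a minimal sketch of how these fields fit together, the following pipeline definition fragment declares a shell-command precondition and attaches it to an activity. The bucket, script path, role name, and object IDs are placeholder values, and the Ec2Instance resource referenced by runsOn is assumed to be defined elsewhere in the pipeline.

```json
{
  "objects": [
    {
      "id": "InputReadyPrecondition",
      "name": "InputReadyPrecondition",
      "type": "ShellCommandPrecondition",
      "role": "DataPipelineDefaultRole",
      "preconditionTimeout": "1 hour",
      "scriptUri": "s3://example-bucket/scripts/check-input.sh",
      "scriptArgument": ["s3://example-bucket/input/", "2015-01-01"],
      "stdout": "s3://example-bucket/logs/precondition-stdout.txt",
      "stderr": "s3://example-bucket/logs/precondition-stderr.txt"
    },
    {
      "id": "ProcessInputActivity",
      "name": "ProcessInputActivity",
      "type": "ShellCommandActivity",
      "runsOn": { "ref": "Ec2Instance" },
      "precondition": { "ref": "InputReadyPrecondition" },
      "command": "echo input is ready"
    }
  ]
}
```

Note that the precondition uses scriptUri while the activity uses command; only one of the two should appear on a given object. Because the work runs on a runsOn-provisioned resource rather than a workerGroup, the stdout and stderr paths are Amazon S3 URIs, as described above.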