attemptTimeout: The timeout time interval for an object attempt. If an attempt does not complete within the start time plus this time interval, AWS Data Pipeline marks the attempt as failed and your retry settings determine the next steps taken.
dependsOn: One or more references to other activities that must reach the FINISHED state before this activity starts.
failureAndRerunMode: Determines whether pipeline object failures and rerun commands cascade through pipeline object dependencies. Possible values include cascade and none.
hiveScript: The Hive script to run.
id: The ID of the object. IDs must be unique within a pipeline definition.
input: The input data source. Data node object reference. Required: yes.
lateAfterTimeout: The time period in which the object run must start. If the object does not start within the scheduled start time plus this time interval, it is considered late.
maximumRetries: The maximum number of times to retry the action. The default value is 2, which results in 3 tries total (1 original attempt plus 2 retries). The maximum value is 5 (6 total attempts).
name: The optional, user-defined label of the object. If you do not provide a name for an object in a pipeline definition, AWS Data Pipeline automatically duplicates the value of id.
onFail: The SNS alarm to raise when the activity fails.
onLateAction: The SNS alarm to raise when the activity fails to start on time.
onSuccess: The SNS alarm to raise when the activity succeeds.
output: The location for the output. Data node object reference. Required: yes.
precondition: A condition that must be met before the object can run. To specify multiple conditions, add multiple precondition fields. The activity cannot run until all of its conditions are met.
retryDelay: The timeout duration between two retry attempts. The default is 10 minutes.
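A minimal sketch of how the retry-related fields above might look together in a pipeline definition object (the id and period values here are illustrative, not from the source). With maximumRetries at its default of 2 and the default 10-minute retryDelay, a failing attempt is retried twice, 10 minutes apart, for 3 attempts total:

```json
{
  "id": "MyHiveActivity",
  "type": "HiveActivity",
  "maximumRetries": "2",
  "retryDelay": "10 Minutes",
  "attemptTimeout": "2 Hours"
}
```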
runsOn: The Amazon EMR cluster on which to run this activity. EmrCluster object reference. Required: yes.
scriptUri: The location of the Hive script to run. For example, s3://script location.
scriptVariable: Specifies script variables for Amazon EMR to pass to Hive while running a script. For example, the following script variables would pass SAMPLE and FILTER_DATE variables to Hive: SAMPLE=s3://elasticmapreduce/samples/hive-ads and FILTER_DATE=#{format(@scheduledStartTime,'YYYY-MM-dd')}%. This field accepts multiple values and works with both script and scriptUri fields. In addition, scriptVariable functions regardless of whether stage is set to true or false. This field is especially useful for sending dynamic values to Hive using AWS Data Pipeline expressions and functions. For more information, see Pipeline Expressions and Functions.
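A hedged sketch of how the script variables described above might be declared in a pipeline definition object; the id, cluster reference, and Hive statement are hypothetical, while the two variable values mirror the example in the text:

```json
{
  "id": "MyHiveActivity",
  "type": "HiveActivity",
  "runsOn": { "ref": "MyEmrCluster" },
  "hiveScript": "SELECT * FROM table WHERE sample_dir = '${SAMPLE}' AND dt = '${FILTER_DATE}';",
  "scriptVariable": [
    "SAMPLE=s3://elasticmapreduce/samples/hive-ads",
    "FILTER_DATE=#{format(@scheduledStartTime,'YYYY-MM-dd')}"
  ]
}
```

Each scriptVariable entry is passed to Hive as a variable, so the script body can reference it with ${SAMPLE} and ${FILTER_DATE} substitution syntax.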
stage: Determines whether staging is enabled. Not permitted with Hive 11, so use an Amazon EMR AMI version 3.2.0 or greater.
type: The type of object. Use one of the predefined AWS Data Pipeline object types.
workerGroup: The worker group, used for routing tasks. If you provide a runsOn value and workerGroup exists, workerGroup is ignored.
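A fuller sketch tying the fields above together in a single HiveActivity definition. This is an assumed illustration, not the reference page's own example: the schedule, cluster, data node, and alarm objects are presumed to be defined elsewhere in the same pipeline, and every id is hypothetical:

```json
{
  "id": "MyHiveActivity",
  "type": "HiveActivity",
  "schedule": { "ref": "MySchedule" },
  "runsOn": { "ref": "MyEmrCluster" },
  "input": { "ref": "MyInputDataNode" },
  "output": { "ref": "MyOutputDataNode" },
  "stage": "true",
  "hiveScript": "INSERT OVERWRITE TABLE ${output1} SELECT host, count(host) FROM ${input1} GROUP BY host;",
  "maximumRetries": "2",
  "onFail": { "ref": "MyFailureSnsAlarm" }
}
```

With stage set to true, the input and output data nodes are staged as Hive tables that the script can reference as ${input1} and ${output1}.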
ref: http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-object-hiveactivity.html