org.apache.spark.sql.execution

GenerateExec

case class GenerateExec(generator: Generator, join: Boolean, outer: Boolean, generatorOutput: Seq[Attribute], child: SparkPlan) extends SparkPlan with UnaryExecNode with CodegenSupport with Product with Serializable

Applies a Generator to a stream of input rows, combining the output of each into a new stream of rows. This operation is similar to flatMap in functional programming, with one important additional feature: the input rows can be joined with their output.

This operator supports whole stage code generation for generators that do not implement terminate().

generator

the generator expression

join

when true, each output row is implicitly joined with the input tuple that produced it.

outer

when true, each input row will be output at least once, even if the output of the given generator is empty.

generatorOutput

the qualified output attributes of the generator of this node, which are constructed during the analysis phase and cannot be changed, because the parent node is already bound to them.
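
For orientation, here is a minimal usage sketch (assuming a local SparkSession; the data and column names are illustrative). Spark plans a generator expression such as explode into this operator; explode_outer sets outer = true, so rows whose collection is empty are still emitted.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{explode, explode_outer}

    val spark = SparkSession.builder().master("local[*]").appName("generate-exec-demo").getOrCreate()
    import spark.implicits._

    val df = Seq((1, Seq("a", "b")), (2, Seq.empty[String])).toDF("id", "items")

    // join = true: each output row carries the input columns (id) alongside the generated column.
    // outer = false: the row whose array is empty (id = 2) is dropped.
    df.select($"id", explode($"items")).explain()

    // outer = true: the empty-array row is kept, with null in the generated column.
    df.select($"id", explode_outer($"items")).explain()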

Linear Supertypes
CodegenSupport, UnaryExecNode, SparkPlan, Serializable, Serializable, Logging, QueryPlan[SparkPlan], TreeNode[SparkPlan], Product, Equals, AnyRef, Any

Instance Constructors

  1. new GenerateExec(generator: Generator, join: Boolean, outer: Boolean, generatorOutput: Seq[Attribute], child: SparkPlan)

    generator

    the generator expression

    join

    when true, each output row is implicitly joined with the input tuple that produced it.

    outer

    when true, each input row will be output at least once, even if the output of the given generator is empty.

    generatorOutput

    the qualified output attributes of the generator of this node, which are constructed during the analysis phase and cannot be changed, because the parent node is already bound to them.

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. lazy val allAttributes: AttributeSeq

    Definition Classes
    QueryPlan
  7. def apply(number: Int): TreeNode[_]

    Definition Classes
    TreeNode
  8. def argString: String

    Definition Classes
    TreeNode
  9. def asCode: String

    Definition Classes
    TreeNode
  10. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  11. lazy val boundGenerator: Generator

  12. lazy val canonicalized: SparkPlan

    Definition Classes
    QueryPlan
  13. val child: SparkPlan

    Definition Classes
    GenerateExec → UnaryExecNode
  14. final def children: Seq[SparkPlan]

    Definition Classes
    UnaryExecNode → TreeNode
  15. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  16. def collect[B](pf: PartialFunction[SparkPlan, B]): Seq[B]

    Definition Classes
    TreeNode
  17. def collectFirst[B](pf: PartialFunction[SparkPlan, B]): Option[B]

    Definition Classes
    TreeNode
  18. def collectLeaves(): Seq[SparkPlan]

    Definition Classes
    TreeNode
  19. lazy val constraints: ExpressionSet

    Definition Classes
    QueryPlan
  20. final def consume(ctx: CodegenContext, outputVars: Seq[ExprCode], row: String = null): String

    Consumes the generated columns or row from the current SparkPlan and calls its parent's doConsume().

    Definition Classes
    CodegenSupport
  21. lazy val containsChild: Set[TreeNode[_]]

    Definition Classes
    TreeNode
  22. def doConsume(ctx: CodegenContext, input: Seq[ExprCode], row: ExprCode): String

    Generates the Java source code to process the rows from the child SparkPlan. This should be overridden by subclasses to support codegen.

    For example, Filter will generate code like this:

        # code to evaluate the predicate expression, result is isNull1 and value2
        if (isNull1 || !value2) continue;
        # call consume(), which will call parent.doConsume()

    Note: A plan can either consume the rows as UnsafeRow (row), or a list of variables (input).

    Definition Classes
    GenerateExec → CodegenSupport
  23. def doExecute(): RDD[InternalRow]

    Overridden by concrete implementations of SparkPlan. Produces the result of the query as an RDD[InternalRow]

    Attributes
    protected
    Definition Classes
    GenerateExec → SparkPlan
  24. def doExecuteBroadcast[T](): Broadcast[T]

    Overridden by concrete implementations of SparkPlan. Produces the result of the query as a broadcast variable.

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SparkPlan
  25. def doPrepare(): Unit

    Overridden by concrete implementations of SparkPlan. It is guaranteed to run before any execute of SparkPlan. This is helpful if we want to set up some state before executing the query, e.g., BroadcastHashJoin uses it to broadcast asynchronously.

    Note: the prepare method has already walked down the tree, so the implementation doesn't need to call children's prepare methods.

    This will only be called once, protected by this.

    Attributes
    protected
    Definition Classes
    SparkPlan
  26. def doProduce(ctx: CodegenContext): String

    Generates the Java source code to process rows; should be overridden by subclasses to support codegen.

    doProduce() usually generates the framework. For example, aggregation could generate this:

        if (!initialized) {
          # create a hash map, then build the aggregation hash map
          # call child.produce()
          initialized = true;
        }
        while (hashmap.hasNext()) {
          row = hashmap.next();
          # build the aggregation results
          # create variables for results
          # call consume(), which will call parent.doConsume()
          if (shouldStop()) return;
        }

    A conceptual Scala sketch of this produce/consume contract follows the member list below.

    Attributes
    protected
    Definition Classes
    GenerateExec → CodegenSupport
  27. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  28. def evaluateRequiredVariables(attributes: Seq[Attribute], variables: Seq[ExprCode], required: AttributeSet): String

    Returns source code to evaluate the variables for the required attributes, and clears the code of the evaluated variables to prevent them from being evaluated twice.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  29. def evaluateVariables(variables: Seq[ExprCode]): String

    Returns source code to evaluate all the variables, and clears their code to prevent them from being evaluated twice.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  30. final def execute(): RDD[InternalRow]

    Returns the result of this query as an RDD[InternalRow] by delegating to doExecute after preparations.

    Concrete implementations of SparkPlan should override doExecute.

    Definition Classes
    SparkPlan
  31. final def executeBroadcast[T](): Broadcast[T]

    Returns the result of this query as a broadcast variable by delegating to doExecuteBroadcast after preparations.

    Concrete implementations of SparkPlan should override doExecuteBroadcast.

    Definition Classes
    SparkPlan
  32. def executeCollect(): Array[InternalRow]

    Runs this query, returning the result as an array. A usage sketch combining this with the metrics map follows the member list below.

    Definition Classes
    SparkPlan
  33. def executeCollectPublic(): Array[Row]

    Runs this query returning the result as an array, using external Row format.

    Definition Classes
    SparkPlan
  34. final def executeQuery[T](query: ⇒ T): T

    Execute a query after preparing the query and adding query plan information to created RDDs for visualization.

    Attributes
    protected
    Definition Classes
    SparkPlan
  35. def executeTake(n: Int): Array[InternalRow]

    Runs this query returning the first n rows as an array.

    This is modeled after RDD.take but never runs any job locally on the driver.

    Definition Classes
    SparkPlan
  36. def executeToIterator(): Iterator[InternalRow]

    Runs this query returning the result as an iterator of InternalRow.

    Note: this will trigger multiple jobs (one for each partition).

    Definition Classes
    SparkPlan
  37. final def expressions: Seq[Expression]

    Definition Classes
    QueryPlan
  38. def fastEquals(other: TreeNode[_]): Boolean

    Definition Classes
    TreeNode
  39. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  40. def find(f: (SparkPlan) ⇒ Boolean): Option[SparkPlan]

    Definition Classes
    TreeNode
  41. def flatMap[A](f: (SparkPlan) ⇒ TraversableOnce[A]): Seq[A]

    Definition Classes
    TreeNode
  42. def foreach(f: (SparkPlan) ⇒ Unit): Unit

    Definition Classes
    TreeNode
  43. def foreachUp(f: (SparkPlan) ⇒ Unit): Unit

    Definition Classes
    TreeNode
  44. def generateTreeString(depth: Int, lastChildren: Seq[Boolean], builder: StringBuilder, verbose: Boolean, prefix: String, addSuffix: Boolean): StringBuilder

    Definition Classes
    TreeNode
  45. val generator: Generator

    the generator expression

  46. val generatorOutput: Seq[Attribute]

    the qualified output attributes of the generator of this node, which are constructed during the analysis phase and cannot be changed, because the parent node is already bound to them.

  47. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  48. def getRelevantConstraints(constraints: Set[Expression]): Set[Expression]

    Attributes
    protected
    Definition Classes
    QueryPlan
  49. def hashCode(): Int

    Definition Classes
    TreeNode → AnyRef → Any
  50. def initializeLogIfNecessary(isInterpreter: Boolean): Unit

    Attributes
    protected
    Definition Classes
    Logging
  51. def innerChildren: Seq[QueryPlan[_]]

    Attributes
    protected
    Definition Classes
    QueryPlan → TreeNode
  52. def inputRDDs(): Seq[RDD[InternalRow]]

    Returns all the RDDs of InternalRow which generate the input rows.

    Note: right now we support up to two RDDs.

    Definition Classes
    GenerateExec → CodegenSupport
  53. def inputSet: AttributeSet

    Definition Classes
    QueryPlan
  54. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  55. def isShouldStopRequired: Boolean

    For optimization to suppress shouldStop() in a loop of WholeStageCodegen. Returning true means we need to insert shouldStop() into the loop producing rows, if any.

    Definition Classes
    CodegenSupport
  56. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  57. val join: Boolean

    when true, each output row is implicitly joined with the input tuple that produced it.

  58. def jsonFields: List[(String, JValue)]

    Attributes
    protected
    Definition Classes
    TreeNode
  59. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  60. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  61. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  62. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  63. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  64. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  65. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  66. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  67. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  68. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  69. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  70. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  71. def longMetric(name: String): SQLMetric

    Returns a LongSQLMetric according to the name.

    Definition Classes
    SparkPlan
  72. def makeCopy(newArgs: Array[AnyRef]): SparkPlan

    Overridden makeCopy that also propagates sqlContext to the copied plan.

    Definition Classes
    SparkPlan → TreeNode
  73. def map[A](f: (SparkPlan) ⇒ A): Seq[A]

    Definition Classes
    TreeNode
  74. def mapChildren(f: (SparkPlan) ⇒ SparkPlan): SparkPlan

    Definition Classes
    TreeNode
  75. def mapExpressions(f: (Expression) ⇒ Expression): GenerateExec.this.type

    Definition Classes
    QueryPlan
  76. def mapProductIterator[B](f: (Any) ⇒ B)(implicit arg0: ClassTag[B]): Array[B]

    Attributes
    protected
    Definition Classes
    TreeNode
  77. def metadata: Map[String, String]

    Returns all metadata that describes more details of this SparkPlan.

    Definition Classes
    SparkPlan
  78. def metricTerm(ctx: CodegenContext, name: String): String

    Creates a metric using the specified name.

    returns

    name of the variable representing the metric

    Definition Classes
    CodegenSupport
  79. lazy val metrics: Map[String, SQLMetric]

    Definition Classes
    GenerateExec → SparkPlan
  80. def missingInput: AttributeSet

    Definition Classes
    QueryPlan
  81. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  82. def newMutableProjection(expressions: Seq[Expression], inputSchema: Seq[Attribute], useSubexprElimination: Boolean = false): MutableProjection

    Attributes
    protected
    Definition Classes
    SparkPlan
  83. def newNaturalAscendingOrdering(dataTypes: Seq[DataType]): Ordering[InternalRow]

    Creates a row ordering for the given schema, in natural ascending order.

    Attributes
    protected
    Definition Classes
    SparkPlan
  84. def newOrdering(order: Seq[SortOrder], inputSchema: Seq[Attribute]): Ordering[InternalRow]

    Attributes
    protected
    Definition Classes
    SparkPlan
  85. def newPredicate(expression: Expression, inputSchema: Seq[Attribute]): Predicate

    Attributes
    protected
    Definition Classes
    SparkPlan
  86. def nodeName: String

    Definition Classes
    TreeNode
  87. final def notify(): Unit

    Definition Classes
    AnyRef
  88. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  89. def numberedTreeString: String

    Definition Classes
    TreeNode
  90. val origin: Origin

    Definition Classes
    TreeNode
  91. def otherCopyArgs: Seq[AnyRef]

    Attributes
    protected
    Definition Classes
    TreeNode
  92. val outer: Boolean

    when true, each input row will be output at least once, even if the output of the given generator is empty.

  93. def output: Seq[Attribute]

    Definition Classes
    GenerateExec → QueryPlan
  94. def outputOrdering: Seq[SortOrder]

    Specifies how data is ordered in each partition.

    Definition Classes
    SparkPlan
  95. def outputPartitioning: Partitioning

    Specifies how data is partitioned across different nodes in the cluster.

    Definition Classes
    GenerateExec → SparkPlan
  96. def outputSet: AttributeSet

    Definition Classes
    QueryPlan
  97. def p(number: Int): SparkPlan

    Definition Classes
    TreeNode
  98. var parent: CodegenSupport

    Which SparkPlan is calling produce() of this one. It's itself for the first SparkPlan.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  99. def preCanonicalized: SparkPlan

    Attributes
    protected
    Definition Classes
    QueryPlan
  100. final def prepare(): Unit

    Prepare a SparkPlan for execution. It's idempotent.

    Definition Classes
    SparkPlan
  101. def prepareSubqueries(): Unit

    Finds scalar subquery expressions in this plan node and starts evaluating them.

    Attributes
    protected
    Definition Classes
    SparkPlan
  102. def prettyJson: String

    Definition Classes
    TreeNode
  103. def printSchema(): Unit

    Definition Classes
    QueryPlan
  104. final def produce(ctx: CodegenContext, parent: CodegenSupport): String

    Returns Java source code to process the rows from input RDD.

    Definition Classes
    CodegenSupport
  105. def producedAttributes: AttributeSet

    Definition Classes
    GenerateExec → QueryPlan
  106. def references: AttributeSet

    Definition Classes
    QueryPlan
  107. def requiredChildDistribution: Seq[Distribution]

    Specifies any partition requirements on the input data for this operator.

    Definition Classes
    SparkPlan
  108. def requiredChildOrdering: Seq[Seq[SortOrder]]

    Specifies the required sort order for each partition of the input data for this operator.

    Definition Classes
    SparkPlan
  109. def resetMetrics(): Unit

    Reset all the metrics.

    Definition Classes
    SparkPlan
  110. final def sameResult(other: SparkPlan): Boolean

    Definition Classes
    QueryPlan
  111. lazy val schema: StructType

    Definition Classes
    QueryPlan
  112. def schemaString: String

    Definition Classes
    QueryPlan
  113. final def semanticHash(): Int

    Definition Classes
    QueryPlan
  114. def shouldStopRequired: Boolean

    Set to false if this plan consumes all rows produced by its children but doesn't output rows to the buffer by calling append(), so the children don't require shouldStop() in their row-producing loops.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  115. def simpleString: String

    Definition Classes
    QueryPlan → TreeNode
  116. def sparkContext: SparkContext

    Attributes
    protected
    Definition Classes
    SparkPlan
  117. final val sqlContext: SQLContext

    A handle to the SQL Context that was used to create this plan. Since many operators need access to the sqlContext for RDD operations or configuration, this field is automatically populated by the query planning infrastructure.

    Definition Classes
    SparkPlan
  118. def statePrefix: String

    Attributes
    protected
    Definition Classes
    QueryPlan
  119. def stringArgs: Iterator[Any]

    Attributes
    protected
    Definition Classes
    TreeNode
  120. val subexpressionEliminationEnabled: Boolean

    Definition Classes
    SparkPlan
  121. def subqueries: Seq[SparkPlan]

    Definition Classes
    QueryPlan
  122. def supportCodegen: Boolean

    Whether this SparkPlan supports whole-stage codegen or not.

    Definition Classes
    GenerateExec → CodegenSupport
  123. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  124. def toJSON: String

    Definition Classes
    TreeNode
  125. def toString(): String

    Definition Classes
    TreeNode → AnyRef → Any
  126. def transform(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan

    Definition Classes
    TreeNode
  127. def transformAllExpressions(rule: PartialFunction[Expression, Expression]): GenerateExec.this.type

    Definition Classes
    QueryPlan
  128. def transformDown(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan

    Definition Classes
    TreeNode
  129. def transformExpressions(rule: PartialFunction[Expression, Expression]): GenerateExec.this.type

    Definition Classes
    QueryPlan
  130. def transformExpressionsDown(rule: PartialFunction[Expression, Expression]): GenerateExec.this.type

    Definition Classes
    QueryPlan
  131. def transformExpressionsUp(rule: PartialFunction[Expression, Expression]): GenerateExec.this.type

    Definition Classes
    QueryPlan
  132. def transformUp(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan

    Definition Classes
    TreeNode
  133. def treeString(verbose: Boolean, addSuffix: Boolean): String

    Definition Classes
    TreeNode
  134. def treeString: String

    Definition Classes
    TreeNode
  135. def usedInputs: AttributeSet

    The subset of inputSet that should be evaluated before this plan.

    We will use this to insert some code to access those columns that are actually used by the current plan before calling doConsume().

    Definition Classes
    CodegenSupport
  136. def validConstraints: Set[Expression]

    Attributes
    protected
    Definition Classes
    QueryPlan
  137. def verboseString: String

    Definition Classes
    QueryPlan → TreeNode
  138. def verboseStringWithSuffix: String

    Definition Classes
    TreeNode
  139. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  140. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  141. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  142. def waitForSubqueries(): Unit

    Blocks the thread until all subqueries finish evaluation and update the results.

    Attributes
    protected
    Definition Classes
    SparkPlan
  143. def withNewChildren(newChildren: Seq[SparkPlan]): SparkPlan

    Definition Classes
    TreeNode
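
A conceptual sketch (not Spark source) of the produce/consume contract referenced by doProduce and doConsume above: a pass-through operator asks its child to produce rows and hands each row's columns straight to its parent via consume(). The trait name PassThroughCodegen is illustrative; only the CodegenSupport and UnaryExecNode members listed on this page are assumed.

    import org.apache.spark.rdd.RDD
    import org.apache.spark.sql.catalyst.InternalRow
    import org.apache.spark.sql.catalyst.expressions.codegen.{CodegenContext, ExprCode}
    import org.apache.spark.sql.execution.{CodegenSupport, UnaryExecNode}

    trait PassThroughCodegen extends UnaryExecNode with CodegenSupport {
      // Forward the child's input RDDs unchanged.
      override def inputRDDs(): Seq[RDD[InternalRow]] =
        child.asInstanceOf[CodegenSupport].inputRDDs()

      // Ask the child to generate its processing loop; the child calls back into this
      // operator's doConsume() for every row it produces.
      override protected def doProduce(ctx: CodegenContext): String =
        child.asInstanceOf[CodegenSupport].produce(ctx, this)

      // No transformation here: pass the child's column variables on to the parent operator.
      override def doConsume(ctx: CodegenContext, input: Seq[ExprCode], row: ExprCode): String =
        consume(ctx, input)
    }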

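A hedged usage sketch for executeCollect and the metrics map, continuing the explode example near the top of this page (the SparkSession spark, the implicits import, and the DataFrame df are assumed from there):

    import org.apache.spark.sql.execution.{GenerateExec, SparkPlan}

    // The physical plan for a query that explodes the array column.
    val plan: SparkPlan =
      df.select($"id", org.apache.spark.sql.functions.explode($"items")).queryExecution.executedPlan

    // Runs the plan and returns the rows in the internal row format.
    val internalRows = plan.executeCollect()

    // After execution, each GenerateExec node in the tree exposes its SQLMetrics
    // (for example the number of output rows).
    plan.collect { case g: GenerateExec => g }.foreach { g =>
      g.metrics.foreach { case (name, metric) => println(s"$name = ${metric.value}") }
    }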