org.apache.spark.sql.execution.aggregate

HashAggregateExec

case class HashAggregateExec(requiredChildDistributionExpressions: Option[Seq[Expression]], groupingExpressions: Seq[NamedExpression], aggregateExpressions: Seq[AggregateExpression], aggregateAttributes: Seq[Attribute], initialInputBufferOffset: Int, resultExpressions: Seq[NamedExpression], child: SparkPlan) extends SparkPlan with UnaryExecNode with CodegenSupport with Product with Serializable

Hash-based aggregate operator that can also fall back to sorting when the data exceeds the available memory.
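
For example, a grouped aggregation typically plans as a partial and a final HashAggregate around a shuffle. A minimal sketch (Spark shell; the exact plan text varies by Spark version):

    val df = spark.range(1000).selectExpr("id % 10 AS key", "id AS value")
    val agg = df.groupBy("key").sum("value")

    // The physical plan usually shows two HashAggregate nodes (partial and
    // final) separated by an Exchange that hash-partitions on the grouping key.
    agg.explain()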

Linear Supertypes
CodegenSupport, UnaryExecNode, SparkPlan, Serializable, Serializable, Logging, QueryPlan[SparkPlan], TreeNode[SparkPlan], Product, Equals, AnyRef, Any

Instance Constructors

  1. new HashAggregateExec(requiredChildDistributionExpressions: Option[Seq[Expression]], groupingExpressions: Seq[NamedExpression], aggregateExpressions: Seq[AggregateExpression], aggregateAttributes: Seq[Attribute], initialInputBufferOffset: Int, resultExpressions: Seq[NamedExpression], child: SparkPlan)


Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. val aggregateAttributes: Seq[Attribute]

  5. val aggregateExpressions: Seq[AggregateExpression]

  6. lazy val allAttributes: AttributeSeq

    Definition Classes
    HashAggregateExec → QueryPlan
  7. def apply(number: Int): TreeNode[_]

    Definition Classes
    TreeNode
  8. def argString: String

    Definition Classes
    TreeNode
  9. def asCode: String

    Definition Classes
    TreeNode
  10. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  11. lazy val canonicalized: SparkPlan

    Definition Classes
    QueryPlan
  12. val child: SparkPlan

    Definition Classes
    HashAggregateExec → UnaryExecNode
  13. final def children: Seq[SparkPlan]

    Definition Classes
    UnaryExecNode → TreeNode
  14. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  15. def collect[B](pf: PartialFunction[SparkPlan, B]): Seq[B]

    Definition Classes
    TreeNode
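
    As a hedged sketch, collect can gather every HashAggregateExec node from an executed physical plan (`agg` stands for any aggregated Dataset; the variable names are hypothetical):

      import org.apache.spark.sql.execution.aggregate.HashAggregateExec

      val plan = agg.queryExecution.executedPlan
      val hashAggs = plan.collect { case h: HashAggregateExec => h }
      hashAggs.foreach(h => println(h.simpleString))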
  16. def collectFirst[B](pf: PartialFunction[SparkPlan, B]): Option[B]

    Definition Classes
    TreeNode
  17. def collectLeaves(): Seq[SparkPlan]

    Definition Classes
    TreeNode
  18. lazy val constraints: ExpressionSet

    Definition Classes
    QueryPlan
  19. final def consume(ctx: CodegenContext, outputVars: Seq[ExprCode], row: String = null): String

    Consume the generated columns or row from the current SparkPlan and call its parent's doConsume().

    Definition Classes
    CodegenSupport
  20. lazy val containsChild: Set[TreeNode[_]]

    Definition Classes
    TreeNode
  21. def createHashMap(): UnsafeFixedWidthAggregationMap

    This is called by the generated Java class, so it must be public.

  22. def createUnsafeJoiner(): UnsafeRowJoiner

    This is called by the generated Java class, so it must be public.

  23. def doConsume(ctx: CodegenContext, input: Seq[ExprCode], row: ExprCode): String

    Generate the Java source code to process the rows from the child SparkPlan.

    This should be overridden by subclasses to support codegen.

    For example, Filter will generate code like this:

      # code to evaluate the predicate expression, result is isNull1 and value2
      if (isNull1 || !value2) continue;
      # call consume(), which will call parent.doConsume()

    Note: A plan can either consume the rows as UnsafeRow (row), or a list of variables (input).

    Definition Classes
    HashAggregateExec → CodegenSupport
  24. def doExecute(): RDD[InternalRow]

    Overridden by concrete implementations of SparkPlan. Produces the result of the query as an RDD[InternalRow].

    Attributes
    protected
    Definition Classes
    HashAggregateExec → SparkPlan
  25. def doExecuteBroadcast[T](): Broadcast[T]

    Overridden by concrete implementations of SparkPlan. Produces the result of the query as a broadcast variable.

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SparkPlan
  26. def doPrepare(): Unit

    Overridden by concrete implementations of SparkPlan. It is guaranteed to run before any execute of SparkPlan. This is helpful if we want to set up some state before executing the query, e.g., BroadcastHashJoin uses it to broadcast asynchronously.

    Note: the prepare method has already walked down the tree, so the implementation doesn't need to call children's prepare methods.

    This will only be called once; the call is synchronized on this.

    Attributes
    protected
    Definition Classes
    SparkPlan
  27. def doProduce(ctx: CodegenContext): String

    Generate the Java source code to process rows; this should be overridden by subclasses to support codegen.

    doProduce() usually generates the framework. For example, aggregation could generate this:

      if (!initialized) {
        # create a hash map, then build the aggregation hash map
        # call child.produce()
        initialized = true;
      }

      while (hashmap.hasNext()) {
        row = hashmap.next();
        # build the aggregation results
        # create variables for results
        # call consume(), which will call parent.doConsume()
        if (shouldStop()) return;
      }

    Attributes
    protected
    Definition Classes
    HashAggregateExec → CodegenSupport
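
    To see the code that doProduce()/doConsume() emit for a whole stage, the debug helpers can dump it (a sketch, assuming Spark 2.x's execution.debug package):

      import org.apache.spark.sql.execution.debug._

      agg.debugCodegen() // prints the generated Java source for each codegen stage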
  28. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  29. def evaluateRequiredVariables(attributes: Seq[Attribute], variables: Seq[ExprCode], required: AttributeSet): String

    Returns source code to evaluate the variables for the required attributes, and clear the code of the evaluated variables to prevent them from being evaluated twice.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  30. def evaluateVariables(variables: Seq[ExprCode]): String

    Returns source code to evaluate all the variables, and clear their code to prevent them from being evaluated twice.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  31. final def execute(): RDD[InternalRow]

    Returns the result of this query as an RDD[InternalRow] by delegating to doExecute after preparations.

    Concrete implementations of SparkPlan should override doExecute.

    Definition Classes
    SparkPlan
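
    A hedged sketch of driving the prepared physical plan directly (the Dataset API normally does this for you; `agg` is any aggregated Dataset):

      val internalRows = agg.queryExecution.executedPlan.execute() // RDD[InternalRow]
      println(internalRows.count())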
  32. final def executeBroadcast[T](): Broadcast[T]

    Returns the result of this query as a broadcast variable by delegating to doExecuteBroadcast after preparations.

    Concrete implementations of SparkPlan should override doExecuteBroadcast.

    Definition Classes
    SparkPlan
  33. def executeCollect(): Array[InternalRow]

    Runs this query returning the result as an array.

    Definition Classes
    SparkPlan
  34. def executeCollectPublic(): Array[Row]

    Runs this query returning the result as an array, using external Row format.

    Definition Classes
    SparkPlan
  35. final def executeQuery[T](query: ⇒ T): T

    Execute a query after preparing the query and adding query plan information to created RDDs for visualization.

    Attributes
    protected
    Definition Classes
    SparkPlan
  36. def executeTake(n: Int): Array[InternalRow]

    Runs this query returning the first n rows as an array.

    This is modeled after RDD.take but never runs any job locally on the driver.

    Definition Classes
    SparkPlan
  37. def executeToIterator(): Iterator[InternalRow]

    Runs this query returning the result as an iterator of InternalRow.

    Note: this will trigger multiple jobs (one for each partition).

    Definition Classes
    SparkPlan
  38. final def expressions: Seq[Expression]

    Definition Classes
    QueryPlan
  39. def fastEquals(other: TreeNode[_]): Boolean

    Definition Classes
    TreeNode
  40. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  41. def find(f: (SparkPlan) ⇒ Boolean): Option[SparkPlan]

    Definition Classes
    TreeNode
  42. def finishAggregate(hashMap: UnsafeFixedWidthAggregationMap, sorter: UnsafeKVExternalSorter, peakMemory: SQLMetric, spillSize: SQLMetric): KVIterator[UnsafeRow, UnsafeRow]

    Called by the generated Java class to finish the aggregate and return a KVIterator.

  43. def flatMap[A](f: (SparkPlan) ⇒ TraversableOnce[A]): Seq[A]

    Definition Classes
    TreeNode
  44. def foreach(f: (SparkPlan) ⇒ Unit): Unit

    Definition Classes
    TreeNode
  45. def foreachUp(f: (SparkPlan) ⇒ Unit): Unit

    Definition Classes
    TreeNode
  46. def generateTreeString(depth: Int, lastChildren: Seq[Boolean], builder: StringBuilder, verbose: Boolean, prefix: String, addSuffix: Boolean): StringBuilder

    Definition Classes
    TreeNode
  47. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  48. def getEmptyAggregationBuffer(): InternalRow

  49. def getRelevantConstraints(constraints: Set[Expression]): Set[Expression]

    Attributes
    protected
    Definition Classes
    QueryPlan
  50. def getTaskMemoryManager(): TaskMemoryManager

  51. val groupingExpressions: Seq[NamedExpression]

  52. def hashCode(): Int

    Definition Classes
    TreeNode → AnyRef → Any
  53. val initialInputBufferOffset: Int

  54. def initializeLogIfNecessary(isInterpreter: Boolean): Unit

    Attributes
    protected
    Definition Classes
    Logging
  55. def innerChildren: Seq[QueryPlan[_]]

    Attributes
    protected
    Definition Classes
    QueryPlan → TreeNode
  56. def inputRDDs(): Seq[RDD[InternalRow]]

    Returns all the RDDs of InternalRow which generate the input rows.

    Note: right now we support up to two RDDs.

    Definition Classes
    HashAggregateExec → CodegenSupport
  57. def inputSet: AttributeSet

    Definition Classes
    QueryPlan
  58. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  59. def isShouldStopRequired: Boolean

    For optimization to suppress shouldStop() in a loop of WholeStageCodegen. Returning true means we need to insert shouldStop() into the loop producing rows, if any.

    Definition Classes
    CodegenSupport
  60. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  61. def jsonFields: List[JField]

    Attributes
    protected
    Definition Classes
    TreeNode
  62. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  63. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  64. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  65. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  66. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  67. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  68. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  69. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  70. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  71. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  72. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  73. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  74. def longMetric(name: String): SQLMetric

    Return a LongSQLMetric according to the name.

    Definition Classes
    SparkPlan
  75. def makeCopy(newArgs: Array[AnyRef]): SparkPlan

    Overridden makeCopy also propagates sqlContext to the copied plan.

    Definition Classes
    SparkPlan → TreeNode
  76. def map[A](f: (SparkPlan) ⇒ A): Seq[A]

    Definition Classes
    TreeNode
  77. def mapChildren(f: (SparkPlan) ⇒ SparkPlan): SparkPlan

    Definition Classes
    TreeNode
  78. def mapExpressions(f: (Expression) ⇒ Expression): HashAggregateExec.this.type

    Definition Classes
    QueryPlan
  79. def mapProductIterator[B](f: (Any) ⇒ B)(implicit arg0: ClassTag[B]): Array[B]

    Attributes
    protected
    Definition Classes
    TreeNode
  80. def metadata: Map[String, String]

    Return all metadata that describes more details of this SparkPlan.

    Definition Classes
    SparkPlan
  81. def metricTerm(ctx: CodegenContext, name: String): String

    Creates a metric using the specified name.

    Returns: the name of the variable representing the metric

    Definition Classes
    CodegenSupport
  82. lazy val metrics: Map[String, SQLMetric]

    Return all the metrics of this SparkPlan.

    Definition Classes
    HashAggregateExec → SparkPlan
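
    A sketch of reading these metrics after an action has run; the metric keys named in the comment (numOutputRows, peakMemory, spillSize) are version-dependent assumptions:

      import org.apache.spark.sql.execution.aggregate.HashAggregateExec

      agg.collect() // run the query so the metrics get populated
      agg.queryExecution.executedPlan.collect { case h: HashAggregateExec => h }
        .foreach { h =>
          // keys typically include numOutputRows, peakMemory, spillSize
          h.metrics.foreach { case (name, m) => println(s"$name = ${m.value}") }
        }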
  83. def missingInput: AttributeSet

    Definition Classes
    QueryPlan
  84. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  85. def newMutableProjection(expressions: Seq[Expression], inputSchema: Seq[Attribute], useSubexprElimination: Boolean = false): MutableProjection

    Attributes
    protected
    Definition Classes
    SparkPlan
  86. def newNaturalAscendingOrdering(dataTypes: Seq[DataType]): Ordering[InternalRow]

    Creates a row ordering for the given schema, in natural ascending order.

    Attributes
    protected
    Definition Classes
    SparkPlan
  87. def newOrdering(order: Seq[SortOrder], inputSchema: Seq[Attribute]): Ordering[InternalRow]

    Attributes
    protected
    Definition Classes
    SparkPlan
  88. def newPredicate(expression: Expression, inputSchema: Seq[Attribute]): Predicate

    Attributes
    protected
    Definition Classes
    SparkPlan
  89. def nodeName: String

    Definition Classes
    TreeNode
  90. final def notify(): Unit

    Definition Classes
    AnyRef
  91. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  92. def numberedTreeString: String

    Definition Classes
    TreeNode
  93. val origin: Origin

    Definition Classes
    TreeNode
  94. def otherCopyArgs: Seq[AnyRef]

    Attributes
    protected
    Definition Classes
    TreeNode
  95. def output: Seq[Attribute]

    Definition Classes
    HashAggregateExec → QueryPlan
  96. def outputOrdering: Seq[SortOrder]

    Specifies how data is ordered in each partition.

    Definition Classes
    SparkPlan
  97. def outputPartitioning: Partitioning

    Specifies how data is partitioned across different nodes in the cluster.

    Definition Classes
    HashAggregateExec → SparkPlan
  98. def outputSet: AttributeSet

    Definition Classes
    QueryPlan
  99. def p(number: Int): SparkPlan

    Definition Classes
    TreeNode
  100. var parent: CodegenSupport

    Which SparkPlan is calling produce() of this one. It's itself for the first SparkPlan.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  101. def preCanonicalized: SparkPlan

    Attributes
    protected
    Definition Classes
    QueryPlan
  102. final def prepare(): Unit

    Prepare a SparkPlan for execution. It's idempotent.

    Definition Classes
    SparkPlan
  103. def prepareSubqueries(): Unit

    Finds scalar subquery expressions in this plan node and starts evaluating them.

    Attributes
    protected
    Definition Classes
    SparkPlan
  104. def prettyJson: String

    Definition Classes
    TreeNode
  105. def printSchema(): Unit

    Definition Classes
    QueryPlan
  106. final def produce(ctx: CodegenContext, parent: CodegenSupport): String

    Returns Java source code to process the rows from the input RDD.

    Definition Classes
    CodegenSupport
  107. def producedAttributes: AttributeSet

    Definition Classes
    HashAggregateExec → QueryPlan
  108. def references: AttributeSet

    Definition Classes
    QueryPlan
  109. def requiredChildDistribution: List[Distribution]

    Specifies any partition requirements on the input data for this operator.

    Definition Classes
    HashAggregateExec → SparkPlan
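
    For HashAggregateExec the requirement is derived from requiredChildDistributionExpressions. A paraphrased sketch of that dispatch (not verbatim source):

      import org.apache.spark.sql.catalyst.expressions.Expression
      import org.apache.spark.sql.catalyst.plans.physical._

      def requiredDistribution(exprs: Option[Seq[Expression]]): List[Distribution] =
        exprs match {
          case Some(e) if e.isEmpty => AllTuples :: Nil                 // global aggregate
          case Some(e)              => ClusteredDistribution(e) :: Nil  // final aggregate
          case None                 => UnspecifiedDistribution :: Nil   // partial aggregate
        }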
  110. val requiredChildDistributionExpressions: Option[Seq[Expression]]

  111. def requiredChildOrdering: Seq[Seq[SortOrder]]

    Specifies the sort order required for each partition of the input data for this operator.

    Definition Classes
    SparkPlan
  112. def resetMetrics(): Unit

    Reset all the metrics.

    Definition Classes
    SparkPlan
  113. val resultExpressions: Seq[NamedExpression]

  114. final def sameResult(other: SparkPlan): Boolean

    Definition Classes
    QueryPlan
  115. lazy val schema: StructType

    Definition Classes
    QueryPlan
  116. def schemaString: String

    Definition Classes
    QueryPlan
  117. final def semanticHash(): Int

    Definition Classes
    QueryPlan
  118. val shouldStopRequired: Boolean

    Set to false if this plan consumes all rows produced by children but doesn't output rows to the buffer by calling append(), so the children don't require shouldStop() in their row-producing loop.

    Attributes
    protected
    Definition Classes
    HashAggregateExec → CodegenSupport
  119. def simpleString: String

    Definition Classes
    HashAggregateExec → QueryPlan → TreeNode
  120. def sparkContext: SparkContext

    Attributes
    protected
    Definition Classes
    SparkPlan
  121. final val sqlContext: SQLContext

    A handle to the SQL Context that was used to create this plan. Since many operators need access to the sqlContext for RDD operations or configuration, this field is automatically populated by the query planning infrastructure.

    Definition Classes
    SparkPlan
  122. def statePrefix: String

    Attributes
    protected
    Definition Classes
    QueryPlan
  123. def stringArgs: Iterator[Any]

    Attributes
    protected
    Definition Classes
    TreeNode
  124. val subexpressionEliminationEnabled: Boolean

    Definition Classes
    SparkPlan
  125. def subqueries: Seq[SparkPlan]

    Definition Classes
    QueryPlan
  126. def supportCodegen: Boolean

    Whether this SparkPlan supports whole-stage codegen or not.

    Definition Classes
    HashAggregateExec → CodegenSupport
  127. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  128. def toJSON: String

    Definition Classes
    TreeNode
  129. def toString(): String

    Definition Classes
    TreeNode → AnyRef → Any
  130. def transform(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan

    Definition Classes
    TreeNode
  131. def transformAllExpressions(rule: PartialFunction[Expression, Expression]): HashAggregateExec.this.type

    Definition Classes
    QueryPlan
  132. def transformDown(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan

    Definition Classes
    TreeNode
  133. def transformExpressions(rule: PartialFunction[Expression, Expression]): HashAggregateExec.this.type

    Definition Classes
    QueryPlan
  134. def transformExpressionsDown(rule: PartialFunction[Expression, Expression]): HashAggregateExec.this.type

    Definition Classes
    QueryPlan
  135. def transformExpressionsUp(rule: PartialFunction[Expression, Expression]): HashAggregateExec.this.type

    Definition Classes
    QueryPlan
  136. def transformUp(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan

    Definition Classes
    TreeNode
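
    A hedged sketch of the transform family applied to a physical plan (`plan` is any SparkPlan, e.g. queryExecution.executedPlan):

      import org.apache.spark.sql.execution.aggregate.HashAggregateExec

      // Walk the plan bottom-up; log each HashAggregateExec, leave the tree unchanged.
      val visited = plan.transformUp {
        case h: HashAggregateExec =>
          println(s"aggregate keys: ${h.groupingExpressions.mkString(", ")}")
          h
      }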
  137. def treeString(verbose: Boolean, addSuffix: Boolean): String

    Definition Classes
    TreeNode
  138. def treeString: String

    Definition Classes
    TreeNode
  139. def usedInputs: AttributeSet

    The subset of inputSet that should be evaluated before this plan.

    We will use this to insert code that accesses the columns actually used by the current plan before calling doConsume().

    Definition Classes
    HashAggregateExec → CodegenSupport
  140. def validConstraints: Set[Expression]

    Attributes
    protected
    Definition Classes
    QueryPlan
  141. def verboseString: String

    Definition Classes
    HashAggregateExec → QueryPlan → TreeNode
  142. def verboseStringWithSuffix: String

    Definition Classes
    TreeNode
  143. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  144. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  145. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  146. def waitForSubqueries(): Unit

    Blocks the thread until all subqueries finish evaluation and update the results.

    Attributes
    protected
    Definition Classes
    SparkPlan
  147. def withNewChildren(newChildren: Seq[SparkPlan]): SparkPlan

    Definition Classes
    TreeNode
