org.apache.spark.sql.execution

SparkStrategies

abstract class SparkStrategies extends QueryPlanner[SparkPlan]

Self Type
SparkPlanner
Linear Supertypes
QueryPlanner[SparkPlan], AnyRef, Any

Instance Constructors

  1. new SparkStrategies()

Abstract Value Members

  1. abstract def collectPlaceholders(plan: SparkPlan): Seq[(SparkPlan, LogicalPlan)]

    Attributes
    protected
    Definition Classes
    QueryPlanner
  2. abstract def prunePlans(plans: Iterator[SparkPlan]): Iterator[SparkPlan]

    Attributes
    protected
    Definition Classes
    QueryPlanner
  3. abstract def strategies: Seq[GenericStrategy[SparkPlan]]

    Definition Classes
    QueryPlanner
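
These three abstract members drive the planning loop inherited from QueryPlanner: `strategies` supplies candidate-generating rules, `prunePlans` filters bad candidates, and `collectPlaceholders` lets a strategy defer planning of subtrees. The following is a hypothetical, self-contained sketch of that contract using toy plan types (not Spark's actual `LogicalPlan`/`SparkPlan` classes); where real Spark strategies would emit a `planLater` placeholder for a child, this sketch simply recurses.

```scala
// Toy stand-ins for LogicalPlan and SparkPlan (hypothetical types).
sealed trait ToyLogical
case class Scan(table: String) extends ToyLogical
case class Filter(child: ToyLogical) extends ToyLogical

sealed trait ToyPhysical
case class PhysScan(table: String) extends ToyPhysical
case class PhysFilter(child: ToyPhysical) extends ToyPhysical

// A strategy maps a logical plan to zero or more physical candidates,
// like GenericStrategy[SparkPlan].
trait ToyStrategy { def apply(plan: ToyLogical): Seq[ToyPhysical] }

object ScanStrategy extends ToyStrategy {
  def apply(plan: ToyLogical): Seq[ToyPhysical] = plan match {
    case Scan(t) => Seq(PhysScan(t))
    case _       => Nil
  }
}

object FilterStrategy extends ToyStrategy {
  def apply(plan: ToyLogical): Seq[ToyPhysical] = plan match {
    case Filter(c) =>
      // Real strategies would emit a placeholder (planLater) here and let
      // collectPlaceholders fill it in; this sketch recurses directly.
      new ToyPlanner().plan(c).toSeq.map(PhysFilter(_))
    case _ => Nil
  }
}

class ToyPlanner {
  // Analogous to the abstract `strategies` member: rules tried in order.
  def strategies: Seq[ToyStrategy] = Seq(ScanStrategy, FilterStrategy)

  // Analogous to `prunePlans`: discard candidates we don't want to cost.
  def prunePlans(plans: Iterator[ToyPhysical]): Iterator[ToyPhysical] =
    plans.take(1)

  // Analogous to QueryPlanner.plan: generate candidates lazily, then prune.
  def plan(logical: ToyLogical): Iterator[ToyPhysical] = {
    val candidates = strategies.iterator.flatMap(s => s(logical))
    prunePlans(candidates)
  }
}
```

Note that candidates are generated through an `Iterator`, matching the signatures above: planning stays lazy, so `prunePlans` can cut off the search without forcing every strategy to run.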

Concrete Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. object Aggregation extends Strategy

    Used to plan the aggregate operator for expressions based on the AggregateFunction2 interface.

  7. object BasicOperators extends Strategy

  8. object DDLStrategy extends Strategy

  9. object InMemoryScans extends Strategy

  10. object JoinSelection extends Strategy with PredicateHelper

Selects the proper physical plan for a join based on the joining keys and the estimated size of the logical plan.

  11. object SpecialLimits extends Strategy

    Plans special cases of limit operators.

  12. object StatefulAggregationStrategy extends Strategy

    Used to plan aggregation queries that are computed incrementally as part of a StreamingQuery.

  13. object StreamingRelationStrategy extends Strategy

    This strategy exists only so that a streaming Dataset/DataFrame created by spark.readStream can be explained; it does not affect actual streaming execution.

  14. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  15. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  16. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  17. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  18. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  19. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  20. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  21. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  22. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  23. final def notify(): Unit

    Definition Classes
    AnyRef
  24. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  25. def plan(plan: LogicalPlan): Iterator[SparkPlan]

    Definition Classes
    QueryPlanner
  26. lazy val singleRowRdd: RDD[InternalRow]

    Attributes
    protected
  27. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  28. def toString(): String

    Definition Classes
    AnyRef → Any
  29. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  30. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  31. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
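
The size-based choice that JoinSelection makes (item 10 above) can be illustrated with a simplified, hypothetical sketch: broadcast the smaller side when its estimated size fits under a threshold (in Spark, governed by `spark.sql.autoBroadcastJoinThreshold`, 10 MB by default), otherwise fall back to a sort-merge join. The types and threshold below are toy stand-ins, not Spark's implementation.

```scala
// Hypothetical relation with a size estimate, standing in for a LogicalPlan
// whose statistics report sizeInBytes.
case class Relation(name: String, sizeInBytes: Long)

sealed trait JoinPlan
case class BroadcastHashJoin(build: Relation, stream: Relation) extends JoinPlan
case class SortMergeJoin(left: Relation, right: Relation) extends JoinPlan

// Mirrors the role of spark.sql.autoBroadcastJoinThreshold (default 10 MB).
val broadcastThreshold: Long = 10L * 1024 * 1024

def selectJoin(left: Relation, right: Relation): JoinPlan = {
  // Prefer building the hash table from (broadcasting) the smaller side.
  val (small, large) =
    if (left.sizeInBytes <= right.sizeInBytes) (left, right) else (right, left)
  if (small.sizeInBytes <= broadcastThreshold) BroadcastHashJoin(small, large)
  else SortMergeJoin(left, right)
}
```

In real plans the same effect can be forced from user code with the `broadcast()` hint on a DataFrame, which marks one side as small regardless of its statistics.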
