org.apache.spark.sql.internal

BaseSessionStateBuilder

abstract class BaseSessionStateBuilder extends AnyRef

Builder class that coordinates construction of a new SessionState.

The builder explicitly defines all components needed by the session state, and creates a session state when build is called. Components should only be initialized once. This is not a problem for most components, as they are only used in the build function. However, some components (conf, catalog, functionRegistry, experimentalMethods & sqlParser) are used as dependencies by other components and are therefore shared. These components are defined as lazy vals to make sure each component is created only once.

A developer can modify the builder by providing custom versions of components, or by using the hooks provided for the analyzer, optimizer & planner. There are some dependencies between the components (documented per component); a developer should respect these when making modifications in order to prevent initialization problems.

A parent SessionState can be used to initialize the new SessionState. The new session state will clone the parent session state's conf, functionRegistry, experimentalMethods and catalog fields. Note that the state is cloned when build is called, not before.
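As a rough sketch of the isolation this machinery provides (using only the public SparkSession API; nothing below touches the builder directly), a second session created from the same SparkSession goes through the session state builder to get its own SessionState, so SQL conf changes do not leak between sessions:

```scala
import org.apache.spark.sql.SparkSession

object SessionIsolationDemo {
  def main(args: Array[String]): Unit = {
    val parent = SparkSession.builder()
      .master("local[*]")
      .appName("session-state-demo")
      .getOrCreate()
    parent.conf.set("spark.sql.shuffle.partitions", "200")

    // newSession() shares the SparkContext and shared state, but builds
    // a fresh, isolated SessionState for the child session.
    val child = parent.newSession()
    child.conf.set("spark.sql.shuffle.partitions", "4")

    // The parent's conf is untouched by the child's change.
    assert(parent.conf.get("spark.sql.shuffle.partitions") == "200")
    parent.stop()
  }
}
```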

Annotations
@Experimental() @Unstable()
Linear Supertypes
AnyRef, Any

Instance Constructors

  1. new BaseSessionStateBuilder(session: SparkSession, parentState: Option[SessionState] = scala.None)

Type Members

  1. type NewBuilder = (SparkSession, Option[SessionState]) ⇒ BaseSessionStateBuilder

Abstract Value Members

  1. abstract def newBuilder: (SparkSession, Option[SessionState]) ⇒ BaseSessionStateBuilder

    Function that produces a new instance of the SessionStateBuilder. This is used by the SessionState's clone functionality. Make sure to override this when implementing your own SessionStateBuilder.

    Attributes
    protected
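A minimal override might look like the following sketch (MySessionStateBuilder is a hypothetical name; only newBuilder is overridden):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.internal.{BaseSessionStateBuilder, SessionState}

// Hypothetical subclass that adds nothing beyond the mandatory override.
class MySessionStateBuilder(
    session: SparkSession,
    parentState: Option[SessionState] = None)
  extends BaseSessionStateBuilder(session, parentState) {

  // Without this override, cloning the session state would keep
  // constructing the base builder instead of this subclass.
  override protected def newBuilder: NewBuilder =
    new MySessionStateBuilder(_, _)
}
```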

Concrete Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def analyzer: Analyzer

    Logical query plan analyzer for resolving unresolved attributes and relations.

    Note: this depends on the conf and catalog fields.

    Attributes
    protected
  7. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  8. def build(): SessionState

    Build the SessionState.

  9. lazy val catalog: SessionCatalog

    Catalog for managing table and database states. If there is a pre-existing catalog, the state of that catalog (temp tables & current database) will be copied into the new catalog.

    Note: this depends on the conf, functionRegistry and sqlParser fields.

    Attributes
    protected
  10. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  11. lazy val conf: SQLConf

    SQL-specific key-value configurations.

    These are either cloned from a pre-existing instance or newly created. The conf is always merged with its SparkConf.

    Attributes
    protected
  12. def createClone: (SparkSession, SessionState) ⇒ SessionState

    Function used to make clones of the session state.

    Attributes
    protected
  13. def createQueryExecution: (LogicalPlan) ⇒ QueryExecution

    Create a query execution object.

    Attributes
    protected
  14. def customCheckRules: Seq[(LogicalPlan) ⇒ Unit]

    Custom check rules to add to the Analyzer. Prefer overriding this instead of creating your own Analyzer.

    Note that this may NOT depend on the analyzer function.

    Attributes
    protected
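A sketch of one possible check rule; the cross-join policy here is invented purely for illustration:

```scala
import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.catalyst.plans.logical.{Join, LogicalPlan}

// Hypothetical policy: fail analysis when a join has no join condition.
val rejectConditionlessJoins: LogicalPlan => Unit = { plan =>
  plan.foreach {
    case j: Join if j.condition.isEmpty =>
      throw new AnalysisException("Joins without a join condition are not allowed")
    case _ => ()
  }
}

// Inside a BaseSessionStateBuilder subclass:
//   override protected def customCheckRules: Seq[LogicalPlan => Unit] =
//     super.customCheckRules :+ rejectConditionlessJoins
```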
  15. def customOperatorOptimizationRules: Seq[Rule[LogicalPlan]]

    Custom operator optimization rules to add to the Optimizer. Prefer overriding this instead of creating your own Optimizer.

    Note that this may NOT depend on the optimizer function.

    Attributes
    protected
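A sketch of a custom operator optimization rule (the rewrite itself is a toy example, not taken from Spark's built-in rule set):

```scala
import org.apache.spark.sql.catalyst.expressions.Literal
import org.apache.spark.sql.catalyst.plans.logical.{Filter, LogicalPlan}
import org.apache.spark.sql.catalyst.rules.Rule

// Hypothetical rewrite: remove filters whose predicate is literally true.
object RemoveTrivialFilters extends Rule[LogicalPlan] {
  override def apply(plan: LogicalPlan): LogicalPlan = plan transform {
    case Filter(Literal(true, _), child) => child
  }
}

// Inside a BaseSessionStateBuilder subclass:
//   override protected def customOperatorOptimizationRules: Seq[Rule[LogicalPlan]] =
//     super.customOperatorOptimizationRules :+ RemoveTrivialFilters
```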
  16. def customPlanningStrategies: Seq[Strategy]

    Custom strategies to add to the planner. Prefer overriding this instead of creating your own Planner.

    Note that this may NOT depend on the planner function.

    Attributes
    protected
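A skeleton of a custom strategy, as a sketch only: a real implementation would pattern-match the logical operators it knows how to plan, while returning Nil defers the plan to the remaining strategies:

```scala
import org.apache.spark.sql.Strategy
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.execution.SparkPlan

// Hypothetical strategy skeleton; it declines every plan.
object MyStrategy extends Strategy {
  override def apply(plan: LogicalPlan): Seq[SparkPlan] = plan match {
    // A real strategy would match specific logical nodes here and
    // return the physical operators that implement them.
    case _ => Nil
  }
}

// Inside a BaseSessionStateBuilder subclass:
//   override protected def customPlanningStrategies: Seq[Strategy] =
//     super.customPlanningStrategies :+ MyStrategy
```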
  17. def customPostHocResolutionRules: Seq[Rule[LogicalPlan]]

    Custom post resolution rules to add to the Analyzer. Prefer overriding this instead of creating your own Analyzer.

    Note that this may NOT depend on the analyzer function.

    Attributes
    protected
  18. def customResolutionRules: Seq[Rule[LogicalPlan]]

    Custom resolution rules to add to the Analyzer. Prefer overriding this instead of creating your own Analyzer.

    Note that this may NOT depend on the analyzer function.

    Attributes
    protected
  19. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  20. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  21. lazy val experimentalMethods: ExperimentalMethods

    Experimental methods that can be used to define custom optimization rules and custom planning strategies.

    This either gets cloned from a pre-existing version or newly created.

    Attributes
    protected
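Note that ExperimentalMethods is also reachable through the public spark.experimental handle, so simple rule injection does not require subclassing the builder. A sketch with an identity rule (NoOpRule is invented for illustration):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.catalyst.rules.Rule

// Identity rule, purely for illustration.
object NoOpRule extends Rule[LogicalPlan] {
  override def apply(plan: LogicalPlan): LogicalPlan = plan
}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
// Rules added here flow into the optimizer built from this field.
spark.experimental.extraOptimizations ++= Seq(NoOpRule)
```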
  22. def extensions: SparkSessionExtensions

    Session extensions defined in the SparkSession.

    Attributes
    protected
  23. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  24. lazy val functionRegistry: FunctionRegistry

    Internal catalog managing functions registered by the user.

    This either gets cloned from a pre-existing version or cloned from the built-in registry.

    Attributes
    protected
  25. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  26. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  27. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  28. def listenerManager: ExecutionListenerManager

    An interface to register custom org.apache.spark.sql.util.QueryExecutionListeners that listen for execution metrics.

    This gets cloned from the parent if available; otherwise a new instance is created.

    Attributes
    protected
  29. def mergeSparkConf(sqlConf: SQLConf, sparkConf: SparkConf): Unit

    Extracts entries from the SparkConf and puts them into the SQLConf.

    Attributes
    protected
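The observable effect of this merge, sketched through the public API only: SQL entries set on the SparkConf at startup surface in the session's runtime conf:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

val sparkConf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("merge-demo")
  .set("spark.sql.shuffle.partitions", "8")

// During session state construction the SparkConf entries are merged
// into the SQLConf, so they are visible through spark.conf.
val spark = SparkSession.builder().config(sparkConf).getOrCreate()
assert(spark.conf.get("spark.sql.shuffle.partitions") == "8")
```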
  30. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  31. final def notify(): Unit

    Definition Classes
    AnyRef
  32. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  33. def optimizer: Optimizer

    Logical query plan optimizer.

    Note: this depends on the conf, catalog and experimentalMethods fields.

    Attributes
    protected
  34. val parentState: Option[SessionState]

  35. def planner: SparkPlanner

    Planner that converts optimized logical plans to physical plans.

    Note: this depends on the conf and experimentalMethods fields.

    Attributes
    protected
  36. lazy val resourceLoader: SessionResourceLoader

    ResourceLoader that is used to load function resources and jars.

    Attributes
    protected
  37. val session: SparkSession

  38. lazy val sqlParser: ParserInterface

    Parser that extracts expressions, plans, table identifiers etc. from SQL texts.

    Note: this depends on the conf field.

    Attributes
    protected
  39. def streamingQueryManager: StreamingQueryManager

    Interface to start and stop streaming queries.

    Attributes
    protected
  40. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  41. def toString(): String

    Definition Classes
    AnyRef → Any
  42. def udfRegistration: UDFRegistration

    Interface exposed to the user for registering user-defined functions.

    Note 1: The user-defined functions must be deterministic.
    Note 2: This depends on the functionRegistry field.

    Attributes
    protected
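This field backs the public spark.udf handle; a minimal sketch of registering and using a function through it ("plusOne" is an arbitrary example name):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()

// The registered function must be deterministic (same input, same output),
// as noted above; Spark may re-execute it during retries.
spark.udf.register("plusOne", (x: Int) => x + 1)
spark.sql("SELECT plusOne(41) AS answer").show()
```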
  43. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  44. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  45. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
