Class com.exasol.spark.DefaultSource

class DefaultSource extends RelationProvider with DataSourceRegister with SchemaRelationProvider with CreatableRelationProvider

The default entry point for creating an integration between Exasol and Spark.

Additionally, it serves as a factory class that creates ExasolRelation instances for Spark applications.

Linear Supertypes

CreatableRelationProvider, SchemaRelationProvider, DataSourceRegister, RelationProvider, AnyRef, Any

Instance Constructors

  1. new DefaultSource()

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. def createRelation(sqlContext: SQLContext, mode: SaveMode, parameters: Map[String, String], data: DataFrame): BaseRelation

    Creates an ExasolRelation after saving a org.apache.spark.sql.DataFrame into an Exasol table.

    sqlContext

    A Spark org.apache.spark.sql.SQLContext context

    mode

    One of the Spark save modes, org.apache.spark.sql.SaveMode

    parameters

    The parameters provided as options; the table parameter is required for writes

    data

    A Spark org.apache.spark.sql.DataFrame to save as an Exasol table

    returns

    An ExasolRelation relation

    Definition Classes
    DefaultSource → CreatableRelationProvider
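    A sketch of how this write path is typically reached from a Spark application. The option names (host, port, username, password, table) follow the connector's documented options but should be treated as assumptions here; all connection values are placeholders.

    ```scala
    import org.apache.spark.sql.{SaveMode, SparkSession}

    // Sketch: writing a DataFrame to an Exasol table through DefaultSource.
    // Option names are assumptions; connection details are placeholders.
    val spark = SparkSession.builder().appName("exasol-write").getOrCreate()

    val df = spark.createDataFrame(Seq((1, "a"), (2, "b"))).toDF("ID", "NAME")

    df.write
      .format("exasol")                      // resolved to DefaultSource via shortName()
      .mode(SaveMode.Append)                 // one of the Spark save modes
      .option("host", "10.0.0.11")           // placeholder Exasol connection details
      .option("port", "8563")
      .option("username", "sys")
      .option("password", "exasol")
      .option("table", "MY_SCHEMA.MY_TABLE") // the table parameter is required for writes
      .save()
    ```

    Spark dispatches the `save()` call to the CreatableRelationProvider overload above, passing the options map as `parameters`.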
  7. def createRelation(sqlContext: SQLContext, parameters: Map[String, String], schema: StructType): BaseRelation

    Creates an ExasolRelation using the provided Spark org.apache.spark.sql.SQLContext, parameters and schema.

    sqlContext

    A Spark org.apache.spark.sql.SQLContext context

    parameters

    The parameters provided as options; the query parameter is required for reads

    schema

    A user-provided schema used to select columns for the relation

    returns

    An ExasolRelation relation

    Definition Classes
    DefaultSource → SchemaRelationProvider
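    A sketch of a read that supplies an explicit schema, which routes through this SchemaRelationProvider overload and skips schema inference. Option names and connection values are assumptions and placeholders, as above.

    ```scala
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

    // Sketch: reading from Exasol with a user-provided schema.
    val spark = SparkSession.builder().appName("exasol-read").getOrCreate()

    val schema = StructType(Seq(
      StructField("ID", IntegerType),
      StructField("NAME", StringType)
    ))

    val df = spark.read
      .format("exasol")
      .schema(schema)                        // user-provided schema, no inference query
      .option("host", "10.0.0.11")           // placeholder connection details
      .option("port", "8563")
      .option("username", "sys")
      .option("password", "exasol")
      .option("query", "SELECT ID, NAME FROM MY_SCHEMA.MY_TABLE") // required for reads
      .load()
    ```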
  8. def createRelation(sqlContext: SQLContext, parameters: Map[String, String]): BaseRelation

    Creates an ExasolRelation using the provided Spark org.apache.spark.sql.SQLContext and parameters.

    Since the schema is not provided, it is inferred by running an Exasol query with a LIMIT 1 clause.

    sqlContext

    A Spark org.apache.spark.sql.SQLContext context

    parameters

    The parameters provided as options; the query parameter is required for reads

    returns

    An ExasolRelation relation

    Definition Classes
    DefaultSource → RelationProvider
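    As a minimal sketch of the inference step, the user query could be wrapped in a subselect with a LIMIT 1 clause before being sent to Exasol. The exact SQL shape the connector uses is an assumption here, and the helper name is hypothetical.

    ```scala
    // Hypothetical helper: derive a cheap schema-inference query from the
    // user-provided query by wrapping it in a subselect with LIMIT 1.
    // The exact wrapping used by the connector is an assumption.
    def inferSchemaQuery(userQuery: String): String =
      s"SELECT * FROM ($userQuery) A LIMIT 1"

    // inferSchemaQuery("SELECT ID, NAME FROM T")
    // yields "SELECT * FROM (SELECT ID, NAME FROM T) A LIMIT 1"
    ```

    Running such a query returns at most one row, which is enough to read the result-set metadata and build a Spark StructType from it.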
  9. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  11. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  13. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  14. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  15. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  16. final def notify(): Unit

    Definition Classes
    AnyRef
  17. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  18. def repartitionPerNode(df: DataFrame, nodesCnt: Int): DataFrame

    Rearranges the dataframe partitions to match the number of Exasol data nodes.

    If nodesCnt < df.rdd.getNumPartitions, then perform

    df.coalesce(nodesCnt)

    in order to reduce the partition count.

    If nodesCnt > df.rdd.getNumPartitions, then perform

    df.repartition(nodesCnt)

    so that there is a partition for each data node.

    If the number of partitions and the number of nodes are the same, then do nothing.
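    The branching above can be sketched as a pure decision function, without a Spark dependency. The helper name `repartitionDecision` and the action type are hypothetical, for illustration only.

    ```scala
    // A pure-logic sketch of repartitionPerNode's decision: coalesce when there
    // are more partitions than nodes, repartition when there are fewer,
    // otherwise leave the dataframe untouched.
    sealed trait PartitionAction
    case object Coalesce extends PartitionAction    // shrink without a full shuffle
    case object Repartition extends PartitionAction // shuffle up to one per node
    case object NoOp extends PartitionAction        // counts already match

    def repartitionDecision(numPartitions: Int, nodesCnt: Int): PartitionAction =
      if (nodesCnt < numPartitions) Coalesce
      else if (nodesCnt > numPartitions) Repartition
      else NoOp

    // Example: 16 partitions onto 4 data nodes -> Coalesce
    ```

    Coalesce is preferred for shrinking because it merges existing partitions without a full shuffle, whereas growing the partition count requires a repartition.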

  19. def shortName(): String

    Definition Classes
    DefaultSource → DataSourceRegister
  20. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  21. def toString(): String

    Definition Classes
    AnyRef → Any
  22. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
