Class

be.dataminded.lighthouse.datalake

JdbcDataLink

class JdbcDataLink extends DataLink

Default JDBC DataLink implementation for reading from and writing to a JDBC database

Linear Supertypes
DataLink, SparkSessions, AnyRef, Any

Instance Constructors

  1. new JdbcDataLink(url: LazyConfig[String], username: LazyConfig[String], password: LazyConfig[String], driver: LazyConfig[String], table: LazyConfig[String], extraProperties: LazyConfig[Map[String, String]] = ..., partitionColumn: LazyConfig[String] = "", numberOfPartitions: LazyConfig[Int] = 0, batchSize: LazyConfig[Int] = 50000, saveMode: SaveMode = SaveMode.Append)

    url

    Function returning the URL of the database you want to connect to. Should be in the format jdbc:mysql://${jdbcHostname}:${jdbcPort}/${jdbcDatabase}

    username

    Function returning the username for the database you want to connect to.

    password

    Function returning the password for the database you want to connect to.

    driver

    Function returning the JDBC driver class to use for the database; must be available on the classpath.

    table

    Function returning the database table to read from or write to.

    extraProperties

    Additional properties to use when connecting to the database.

    partitionColumn

    The column to partition your data on; should be of an integer type.

    numberOfPartitions

    The number of partitions to use when reading or writing your data. If 0, batchSize is used to determine the number of partitions. If both numberOfPartitions and batchSize are non-zero, numberOfPartitions takes precedence.

    batchSize

    The number of rows to retrieve per partition. If 0, numberOfPartitions is used to determine the batch size.

    saveMode

    The Spark SQL SaveMode to use when writing.
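As a sketch, a link to a MySQL table might be constructed as follows. The connection values, the table, and the conversion of plain values to LazyConfig are assumptions for illustration (the library is assumed to provide an implicit conversion), not details taken from this page:

```scala
import be.dataminded.lighthouse.datalake.JdbcDataLink
import org.apache.spark.sql.SaveMode

// Hypothetical connection settings; plain String/Int values are assumed to be
// lifted to LazyConfig by an implicit conversion provided by the library.
val ordersLink = new JdbcDataLink(
  url = "jdbc:mysql://localhost:3306/sales", // jdbc:mysql://${jdbcHostname}:${jdbcPort}/${jdbcDatabase}
  username = "etl_user",
  password = sys.env.getOrElse("DB_PASSWORD", ""),
  driver = "com.mysql.cj.jdbc.Driver",       // must be available on the classpath
  table = "orders",
  partitionColumn = "order_id",              // integer-typed column
  numberOfPartitions = 8,                    // wins over batchSize when both are non-zero
  saveMode = SaveMode.Overwrite
)
```

Leaving both numberOfPartitions and batchSize at their defaults keeps the documented fallback behaviour: 0 partitions with a batchSize of 50000 rows per partition.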

Value Members

  1. final def !=(arg0: Any): Boolean

     Definition Classes
     AnyRef → Any
  2. final def ##(): Int

     Definition Classes
     AnyRef → Any
  3. final def ==(arg0: Any): Boolean

     Definition Classes
     AnyRef → Any
  4. final def asInstanceOf[T0]: T0

     Definition Classes
     Any
  5. def clone(): AnyRef

     Attributes
     protected[java.lang]
     Definition Classes
     AnyRef
     Annotations
     @throws( ... )
  6. lazy val connectionProperties: Map[String, String]
  7. final def eq(arg0: AnyRef): Boolean

     Definition Classes
     AnyRef
  8. def equals(arg0: Any): Boolean

     Definition Classes
     AnyRef → Any
  9. def finalize(): Unit

     Attributes
     protected[java.lang]
     Definition Classes
     AnyRef
     Annotations
     @throws( classOf[java.lang.Throwable] )
  10. final def getClass(): Class[_]

     Definition Classes
     AnyRef → Any
  11. def hashCode(): Int

     Definition Classes
     AnyRef → Any
  12. final def isInstanceOf[T0]: Boolean

     Definition Classes
     Any
  13. final def ne(arg0: AnyRef): Boolean

     Definition Classes
     AnyRef
  14. final def notify(): Unit

     Definition Classes
     AnyRef
  15. final def notifyAll(): Unit

     Definition Classes
     AnyRef
  16. def read(): DataFrame

     Definition Classes
     JdbcDataLink → DataLink
  17. def readAs[T]()(implicit arg0: Encoder[T]): Dataset[T]

     Definition Classes
     DataLink
  18. lazy val spark: SparkSession

     Definition Classes
     SparkSessions
  19. val sparkOptions: Map[String, String]

     Custom Spark configuration for the job.

     Definition Classes
     SparkSessions
  20. final def synchronized[T0](arg0: ⇒ T0): T0

     Definition Classes
     AnyRef
  21. def toString(): String

     Definition Classes
     AnyRef → Any
  22. final def wait(): Unit

     Definition Classes
     AnyRef
     Annotations
     @throws( ... )
  23. final def wait(arg0: Long, arg1: Int): Unit

     Definition Classes
     AnyRef
     Annotations
     @throws( ... )
  24. final def wait(arg0: Long): Unit

     Definition Classes
     AnyRef
     Annotations
     @throws( ... )
  25. def write[T](dataset: Dataset[T]): Unit

     Definition Classes
     JdbcDataLink → DataLink
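A sketch of the read and write members, assuming `link` is an already-configured JdbcDataLink; the case class and column names are hypothetical and must match the table's schema:

```scala
import org.apache.spark.sql.{DataFrame, Dataset}

// Untyped read: the whole configured table as a DataFrame.
val raw: DataFrame = link.read()

// Typed read via readAs, which needs an implicit Encoder in scope;
// `link.spark` is the lazy SparkSession inherited from SparkSessions.
case class Order(orderId: Long, amount: Double) // hypothetical row type
import link.spark.implicits._
val typed: Dataset[Order] = link.readAs[Order]()

// Write back through the link, using the SaveMode the link was constructed with.
link.write(typed)
```

Because read, readAs, and write all go through the same link, the connection, partitioning, and save-mode settings live in one place rather than being repeated at each call site.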

Inherited from DataLink

Inherited from SparkSessions

Inherited from AnyRef

Inherited from Any
