Trait

com.coxautodata.waimak.metastore

HadoopDBConnector

trait HadoopDBConnector extends DBConnector

Hadoop database connection trait that provides Hadoop-specific table functions on top of DBConnector (e.g. creating parquet tables).

Linear Supertypes
DBConnector, Logging, AnyRef, Any

Abstract Value Members

  1. abstract def forceRecreateTables: Boolean

    Force a drop and recreate of tables even when an update is requested (necessary in cases of schema change). See the sketch below.
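
    Because this flag makes updateTableParquetLocationDDLs drop and recreate even non-partitioned tables, a caller that must never drop an existing table can check it before generating DDLs. A minimal sketch, assuming `connector` is any concrete HadoopDBConnector implementation supplied by the caller; the guard itself is a hypothetical usage pattern, not part of the library:

```scala
import com.coxautodata.waimak.metastore.HadoopDBConnector

object ForceRecreateCheckExample {

  // Hypothetical guard: refuse to generate a location update when the connector is
  // configured to drop and recreate tables on update.
  def updateWithoutDrop(connector: HadoopDBConnector, table: String, path: String): Seq[String] = {
    require(!connector.forceRecreateTables,
      s"Connector would drop and recreate $table on update; refusing to proceed")
    // With the flag off, a non-partitioned table only has its location changed.
    connector.updateTableParquetLocationDDLs(table, path)
  }
}
```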

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  2. final def ##(): Int
     Definition Classes: AnyRef → Any
  3. final def ==(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  4. final def asInstanceOf[T0]: T0
     Definition Classes: Any
  5. def clone(): AnyRef
     Attributes: protected[java.lang]
     Definition Classes: AnyRef
     Annotations: @throws( ... )
  6. final def eq(arg0: AnyRef): Boolean
     Definition Classes: AnyRef
  7. def equals(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  8. def finalize(): Unit
     Attributes: protected[java.lang]
     Definition Classes: AnyRef
     Annotations: @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]
     Definition Classes: AnyRef → Any
  10. def hashCode(): Int
     Definition Classes: AnyRef → Any
  11. final def isInstanceOf[T0]: Boolean
     Definition Classes: Any
  12. def isTraceEnabled(): Boolean
     Attributes: protected
     Definition Classes: Logging
  13. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
     Attributes: protected
     Definition Classes: Logging
  14. def logDebug(msg: ⇒ String): Unit
     Attributes: protected
     Definition Classes: Logging
  15. def logError(msg: ⇒ String, throwable: Throwable): Unit
     Attributes: protected
     Definition Classes: Logging
  16. def logError(msg: ⇒ String): Unit
     Attributes: protected
     Definition Classes: Logging
  17. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
     Attributes: protected
     Definition Classes: Logging
  18. def logInfo(msg: ⇒ String): Unit
     Attributes: protected
     Definition Classes: Logging
  19. def logName: String
     Attributes: protected
     Definition Classes: Logging
  20. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
     Attributes: protected
     Definition Classes: Logging
  21. def logTrace(msg: ⇒ String): Unit
     Attributes: protected
     Definition Classes: Logging
  22. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
     Attributes: protected
     Definition Classes: Logging
  23. def logWarning(msg: ⇒ String): Unit
     Attributes: protected
     Definition Classes: Logging
  24. final def ne(arg0: AnyRef): Boolean
     Definition Classes: AnyRef
  25. final def notify(): Unit
     Definition Classes: AnyRef
  26. final def notifyAll(): Unit
     Definition Classes: AnyRef
  27. def recreateTableFromParquetDDLs(tableName: String, path: String, partitionColumns: Seq[String] = Seq.empty): Seq[String]

    Recreate a table from parquet files, inferring the schema from the parquet. The table is dropped if it exists and then recreated. See the sketch below.

    tableName: name of the table
    path: path of the table location
    partitionColumns: optional list of partition columns
    returns: the SQL statements which need executing to perform the table recreation
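
    Example (illustrative): generating the recreation statements for a hypothetical partitioned table and submitting them through the submit methods inherited from DBConnector. The table name, path and partition column are assumptions made up for the sketch; `connector` is any concrete HadoopDBConnector implementation.

```scala
import com.coxautodata.waimak.metastore.HadoopDBConnector

object RecreateTableExample {

  def recreateReportTable(connector: HadoopDBConnector): Unit = {
    // Produce the DDL statements; nothing is executed yet. The method only returns
    // the statements that would drop the table (if present) and recreate it, with
    // the schema inferred from the parquet files under `path`.
    val ddls: Seq[String] = connector.recreateTableFromParquetDDLs(
      tableName = "daily_report",                    // hypothetical table name
      path = "hdfs:///data/warehouse/daily_report",  // hypothetical table location
      partitionColumns = Seq("report_date")          // hypothetical partition column
    )

    // Execute the generated statements as a single resultless submission.
    connector.submitAtomicResultlessQueries(ddls)
  }
}
```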

  28. def submitAtomicResultlessQueries(ddls: Seq[String]): Unit
     Definition Classes: DBConnector
  29. def submitResultlessQuery(ddl: String): Unit

    Submit a query that returns no results (i.e. schema change operations). Exceptions will be thrown if the query fails. See the example below.

    ddl: SQL DDL as a string

    Definition Classes: DBConnector
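
    Example (illustrative): submitting a single schema-change statement. The DDL text and table name are assumptions for the sketch; the exact dialect accepted depends on the metastore the concrete connector talks to.

```scala
import com.coxautodata.waimak.metastore.HadoopDBConnector

object SubmitQueryExample {

  // Drop a hypothetical staging table; an exception is thrown if the statement fails.
  def dropStagingTable(connector: HadoopDBConnector): Unit =
    connector.submitResultlessQuery("DROP TABLE IF EXISTS staging_orders")
}
```
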
  30. final def synchronized[T0](arg0: ⇒ T0): T0
     Definition Classes: AnyRef
  31. def toString(): String
     Definition Classes: AnyRef → Any
  32. def updateTableParquetLocationDDLs(tableName: String, path: String, partitionColumns: Seq[String] = Seq.empty): Seq[String]

    Update the data location of a parquet table. If the table is partitioned, it is dropped if it exists and recreated. If the table is not partitioned and forceRecreateTables is true, it is likewise dropped if it exists and recreated. If the table is not partitioned and forceRecreateTables is false, the data location is changed without dropping the table. The table is created if it does not already exist. See the sketch below.

    tableName: name of the table
    path: path of the table location
    partitionColumns: optional list of partition columns
    returns: the SQL statements which need executing to perform the table update
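
    Example (illustrative): repointing a non-partitioned table at a newly written snapshot directory. Whether the generated statements drop and recreate the table or only alter its location depends on partitioning and forceRecreateTables, as described above; the table name and paths are assumptions for the sketch, and `connector` is any concrete HadoopDBConnector implementation.

```scala
import com.coxautodata.waimak.metastore.HadoopDBConnector

object UpdateLocationExample {

  def repointCustomersTable(connector: HadoopDBConnector): Unit = {
    // For a non-partitioned table: with forceRecreateTables = false this yields
    // location-change DDLs only; with forceRecreateTables = true (or for a
    // partitioned table) the table is dropped if it exists and recreated.
    val ddls: Seq[String] = connector.updateTableParquetLocationDDLs(
      tableName = "customers",                                 // hypothetical table name
      path = "hdfs:///data/warehouse/customers/snapshot=2024"  // hypothetical new location
    )

    // The statements are only returned; submit them to apply the change.
    ddls.foreach(connector.submitResultlessQuery)
  }
}
```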

  33. final def wait(): Unit
     Definition Classes: AnyRef
     Annotations: @throws( ... )
  34. final def wait(arg0: Long, arg1: Int): Unit
     Definition Classes: AnyRef
     Annotations: @throws( ... )
  35. final def wait(arg0: Long): Unit
     Definition Classes: AnyRef
     Annotations: @throws( ... )
