org.apache.spark.sql.hive.test

TestHiveContext

class TestHiveContext extends HiveContext

A locally running test instance of Spark's Hive execution engine.

Data from testTables will be automatically loaded whenever a query is run over those tables. Calling reset will delete all tables and other state in the database, leaving the database in a "clean" state.

TestHive is the singleton object version of this class, because instantiating multiple copies of the Hive metastore seems to lead to weird non-deterministic failures. Therefore, test cases that rely on TestHive must run serially.
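A minimal usage sketch via the TestHive singleton (assuming the standard src test table, which Spark's own suites register):

  import org.apache.spark.sql.hive.test.TestHive

  // Running a query over a registered test table loads it on demand.
  val rows = TestHive.sql("SELECT key, value FROM src LIMIT 10").collect()

  // Return the metastore to a "clean" state between test suites.
  TestHive.reset()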

Self Type
TestHiveContext
Linear Supertypes
HiveContext, SQLContext, Serializable, Serializable, UDFRegistration, ExpressionConversions, SQLConf, Logging, AnyRef, Any
Known Subclasses
TestHive

Instance Constructors

  1. new TestHiveContext(sc: SparkContext)
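    A construction sketch (in tests the TestHive singleton is normally preferred; the local master string is illustrative):

      import org.apache.spark.SparkContext
      import org.apache.spark.sql.hive.test.TestHiveContext

      val sc = new SparkContext("local[2]", "TestSQLContext")
      val hiveContext = new TestHiveContext(sc)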

Type Members

  1. implicit class DslAttribute extends AnyRef

    Definition Classes
    ExpressionConversions
  2. implicit class DslExpression extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  3. implicit class DslString extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  4. implicit class DslSymbol extends ImplicitAttribute

    Definition Classes
    ExpressionConversions
  5. class HiveQLQueryExecution extends QueryExecution

    Attributes
    protected[org.apache.spark.sql.hive]
  6. abstract class ImplicitAttribute extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  7. abstract class QueryExecution extends TestHiveContext.QueryExecution

    Overrides QueryExecution with a special debug workflow.

  8. class SparkPlanner extends SparkStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  9. implicit class SqlCmd extends AnyRef

    Attributes
    protected[org.apache.spark.sql.hive]
  10. case class TestTable(name: String, commands: (() ⇒ Unit)*) extends Product with Serializable
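    A sketch of defining and registering a lazily initialized test table (table name, DDL, and data path are illustrative):

      import org.apache.spark.sql.hive.test.TestHive

      val myTable = TestHive.TestTable("my_test_table",
        () => TestHive.runSqlHive("CREATE TABLE my_test_table (key INT, value STRING)"),
        () => TestHive.runSqlHive(
          "LOAD DATA LOCAL INPATH 'data/my_table.txt' INTO TABLE my_test_table"))
      TestHive.registerTestTable(myTable)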

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def analyze(tableName: String): Unit

    Analyzes the given table in the current database to generate statistics, which will be used in query optimizations.

    Right now, it only supports Hive tables and it only updates the size of a Hive table in the Hive metastore.

    Definition Classes
    HiveContext
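    For example (assuming a hiveContext in scope and a table named src):

      // Record table-size statistics in the Hive metastore so the optimizer
      // can use them in query planning.
      hiveContext.analyze("src")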
  7. lazy val analyzer: Analyzer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  8. def applySchema(rowRDD: RDD[Row], schema: StructType): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  9. def approxCountDistinct(e: Expression, rsd: Double): ApproxCountDistinct

    Definition Classes
    ExpressionConversions
  10. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  11. def avg(e: Expression): Average

    Definition Classes
    ExpressionConversions
  12. implicit def binaryToLiteral(a: Array[Byte]): Literal

    Definition Classes
    ExpressionConversions
  13. implicit def booleanToLiteral(b: Boolean): Literal

    Definition Classes
    ExpressionConversions
  14. implicit def byteToLiteral(b: Byte): Literal

    Definition Classes
    ExpressionConversions
  15. def cacheTable(tableName: String): Unit

    Definition Classes
    SQLContext
  16. var cacheTables: Boolean

  17. lazy val catalog: HiveMetastoreCatalog with OverrideCatalog

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  18. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  19. def configure(): Unit

    Sets up the system initially or after a RESET command.

    Attributes
    protected
  20. def count(e: Expression): Count

    Definition Classes
    ExpressionConversions
  21. def countDistinct(e: Expression*): CountDistinct

    Definition Classes
    ExpressionConversions
  22. def createParquetFile[A <: Product](path: String, allowExisting: Boolean, conf: Configuration)(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  23. implicit def createSchemaRDD[A <: Product](rdd: RDD[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): SchemaRDD

    Definition Classes
    SQLContext
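    A sketch of the implicit conversion (Record and the table name are illustrative):

      import org.apache.spark.sql.SchemaRDD

      case class Record(key: Int, value: String)

      import hiveContext.createSchemaRDD  // bring the implicit into scope
      val rdd = hiveContext.sparkContext.parallelize(Seq(Record(1, "a"), Record(2, "b")))
      val schemaRDD: SchemaRDD = rdd      // converted via createSchemaRDD
      hiveContext.registerRDDAsTable(schemaRDD, "records")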
  24. def createTable[A <: Product](tableName: String, allowExisting: Boolean = true)(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): Unit

    Creates a table using the schema of the given class.

    A
    A case class that is used to describe the schema of the table to be created.

    tableName
    The name of the table to create.

    allowExisting
    When false, an exception will be thrown if the table already exists.

    Definition Classes
    HiveContext
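    For example (Person and the table name are illustrative):

      case class Person(name: String, age: Int)

      // Creates a Hive table whose schema is derived from the Person case class.
      // With allowExisting = false, an existing "people" table raises an exception.
      hiveContext.createTable[Person]("people", allowExisting = false)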
  25. implicit def decimalToLiteral(d: BigDecimal): Literal

    Definition Classes
    ExpressionConversions
  26. val describedTable: Regex

  27. implicit def doubleToLiteral(d: Double): Literal

    Definition Classes
    ExpressionConversions
  28. lazy val emptyResult: RDD[Row]

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  29. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  30. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  31. def executePlan(plan: LogicalPlan): QueryExecution

    Definition Classes
    TestHiveContext → HiveContext → SQLContext
  32. def executeSql(sql: String): TestHiveContext.QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  33. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  34. def first(e: Expression): First

    Definition Classes
    ExpressionConversions
  35. implicit def floatToLiteral(f: Float): Literal

    Definition Classes
    ExpressionConversions
  36. lazy val functionRegistry: HiveFunctionRegistry with OverrideFunctionRegistry

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  37. def getAllConfs: Map[String, String]

    Definition Classes
    SQLConf
  38. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  39. def getConf(key: String, defaultValue: String): String

    Definition Classes
    SQLConf
  40. def getConf(key: String): String

    Definition Classes
    SQLConf
  41. def getHiveFile(path: String): File

  42. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  43. lazy val hiveDevHome: Option[File]

    The location of the Hive source code.

  44. val hiveFilesTemp: File

  45. lazy val hiveHome: Option[File]

    The location of the compiled Hive distribution.

  46. val hivePlanner: SparkPlanner with HiveStrategies

    Definition Classes
    HiveContext
  47. val hiveQTestUtilTables: Seq[TestTable]

  48. lazy val hiveconf: HiveConf

    SQLConf and HiveConf contracts:

    1. Reuse the existing started SessionState, if any.
    2. When the Hive session is first initialized, params in HiveConf get picked up by the SQLConf. Additionally, any properties set by set() or a SET command inside sql() will be set in the SQLConf as well as in the HiveConf.

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    HiveContext
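    A sketch of the contract (the property name is illustrative):

      // A property set through setConf lands in the SQLConf and the HiveConf.
      hiveContext.setConf("spark.sql.shuffle.partitions", "10")
      // A SET command issued through sql() behaves the same way.
      hiveContext.sql("SET spark.sql.shuffle.partitions=20")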
  49. val inRepoTests: File

  50. implicit def intToLiteral(i: Int): Literal

    Definition Classes
    ExpressionConversions
  51. def isCached(tableName: String): Boolean

    Definition Classes
    SQLContext
  52. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  53. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  54. def jsonFile(path: String, samplingRatio: Double): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  55. def jsonFile(path: String, schema: StructType): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  56. def jsonFile(path: String): SchemaRDD

    Definition Classes
    SQLContext
  57. def jsonRDD(json: RDD[String], samplingRatio: Double): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  58. def jsonRDD(json: RDD[String], schema: StructType): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  59. def jsonRDD(json: RDD[String]): SchemaRDD

    Definition Classes
    SQLContext
  60. def loadTestTable(name: String): Unit

  61. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  62. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  63. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  64. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  65. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  66. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  67. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  68. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  69. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  70. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  71. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  72. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  73. implicit def logicalPlanToSparkQuery(plan: LogicalPlan): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  74. implicit def longToLiteral(l: Long): Literal

    Definition Classes
    ExpressionConversions
  75. def lower(e: Expression): Lower

    Definition Classes
    ExpressionConversions
  76. def max(e: Expression): Max

    Definition Classes
    ExpressionConversions
  77. lazy val metastorePath: String

  78. def min(e: Expression): Min

    Definition Classes
    ExpressionConversions
  79. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  80. final def notify(): Unit

    Definition Classes
    AnyRef
  81. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  82. val optimizer: Optimizer.type

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  83. val originalUdfs: Set[String]

    Records the UDFs present when the server starts, so we can delete ones that are created by tests.

    Attributes
    protected
  84. lazy val outputBuffer: OutputStream { ... /* 4 definitions in type refinement */ }

    Attributes
    protected
    Definition Classes
    HiveContext
  85. def parquetFile(path: String): SchemaRDD

    Definition Classes
    SQLContext
  86. def parseSql(sql: String): LogicalPlan

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  87. val parser: SqlParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  88. val planner: SparkPlanner with HiveStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  89. val prepareForExecution: RuleExecutor[SparkPlan] { val batches: List[this.Batch] }

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  90. def registerFunction[T](name: String, func: Function22[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  91. def registerFunction[T](name: String, func: Function21[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  92. def registerFunction[T](name: String, func: Function20[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  93. def registerFunction[T](name: String, func: Function19[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  94. def registerFunction[T](name: String, func: Function18[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  95. def registerFunction[T](name: String, func: Function17[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  96. def registerFunction[T](name: String, func: Function16[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  97. def registerFunction[T](name: String, func: Function15[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  98. def registerFunction[T](name: String, func: Function14[_, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  99. def registerFunction[T](name: String, func: Function13[_, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  100. def registerFunction[T](name: String, func: Function12[_, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  101. def registerFunction[T](name: String, func: Function11[_, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  102. def registerFunction[T](name: String, func: Function10[_, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  103. def registerFunction[T](name: String, func: Function9[_, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  104. def registerFunction[T](name: String, func: Function8[_, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  105. def registerFunction[T](name: String, func: Function7[_, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  106. def registerFunction[T](name: String, func: Function6[_, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  107. def registerFunction[T](name: String, func: Function5[_, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  108. def registerFunction[T](name: String, func: Function4[_, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  109. def registerFunction[T](name: String, func: Function3[_, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  110. def registerFunction[T](name: String, func: Function2[_, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  111. def registerFunction[T](name: String, func: Function1[_, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
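    For example, the one-argument overload (the UDF name is illustrative):

      // Register a Scala function as a SQL UDF, then use it in a query.
      hiveContext.registerFunction("strLen", (s: String) => s.length)
      hiveContext.sql("SELECT strLen(value) FROM src")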
  112. def registerRDDAsTable(rdd: SchemaRDD, tableName: String): Unit

    Definition Classes
    SQLContext
  113. def registerTestTable(testTable: TestTable): HashMap[String, TestTable]

  114. def reset(): Unit

    Resets the test instance by deleting any tables that have been created. TODO: also clear out UDFs, views, etc.
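    A sketch of use between suites (assuming the TestHive singleton):

      TestHive.reset()                    // drop all tables and reset state
      TestHive.sql("SELECT * FROM src")   // test tables reload on demand afterwards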

  115. def runHive(cmd: String, maxRows: Int = 1000): Seq[String]

    Execute the command using Hive and return the results as a sequence. Each element in the sequence is one row.

    Attributes
    protected
    Definition Classes
    HiveContext
  116. def runSqlHive(sql: String): Seq[String]

    Runs the specified SQL query using Hive.

    Definition Classes
    TestHiveContext → HiveContext
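    For example:

      // Each element of the result is one row of Hive's textual output.
      val tables: Seq[String] = hiveContext.runSqlHive("SHOW TABLES")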
  117. lazy val sessionState: SessionState

    SQLConf and HiveConf contracts:

    1. Reuse the existing started SessionState, if any.
    2. When the Hive session is first initialized, params in HiveConf get picked up by the SQLConf. Additionally, any properties set by set() or a SET command inside sql() will be set in the SQLConf as well as in the HiveConf.

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    HiveContext
  118. def setConf(key: String, value: String): Unit

    Definition Classes
    HiveContext → SQLConf
  119. def setConf(props: Properties): Unit

    Definition Classes
    SQLConf
  120. val settings: Map[String, String]

    Attributes
    protected[org.apache.spark]
    Definition Classes
    SQLConf
  121. implicit def shortToLiteral(s: Short): Literal

    Definition Classes
    ExpressionConversions
  122. val sparkContext: SparkContext

    Definition Classes
    SQLContext
  123. def sql(sqlText: String): SchemaRDD

    Definition Classes
    HiveContext → SQLContext
  124. implicit def stringToLiteral(s: String): Literal

    Definition Classes
    ExpressionConversions
  125. def sum(e: Expression): Sum

    Definition Classes
    ExpressionConversions
  126. def sumDistinct(e: Expression): SumDistinct

    Definition Classes
    ExpressionConversions
  127. implicit def symbolToUnresolvedAttribute(s: Symbol): UnresolvedAttribute

    Definition Classes
    ExpressionConversions
  128. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  129. def table(tableName: String): SchemaRDD

    Definition Classes
    SQLContext
  130. lazy val testTables: HashMap[String, TestTable]

    A list of test tables and the DDL required to initialize them. A test table is loaded on demand when a query is run against it.
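    For example (srcpart is one of the standard test tables registered by Spark's own suites):

      // Force a registered test table to load without querying it.
      TestHive.loadTestTable("srcpart")
      // Inspect which test tables are currently registered.
      TestHive.testTables.keys.foreach(println)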

  131. val testTempDir: File

  132. implicit def timestampToLiteral(t: Timestamp): Literal

    Definition Classes
    ExpressionConversions
  133. def toString(): String

    Definition Classes
    AnyRef → Any
  134. def uncacheTable(tableName: String): Unit

    Definition Classes
    SQLContext
  135. def upper(e: Expression): Upper

    Definition Classes
    ExpressionConversions
  136. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  137. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  138. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  139. lazy val warehousePath: String

Deprecated Value Members

  1. def hiveql(hqlQuery: String): SchemaRDD

    Definition Classes
    HiveContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.1)

  2. def hql(hqlQuery: String): SchemaRDD

    Definition Classes
    HiveContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.1)
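    Both deprecated methods are superseded by sql(), which on a HiveContext parses HiveQL by default:

      // Before (deprecated since 1.1):
      val before = hiveContext.hql("SELECT key FROM src")
      // After:
      val after = hiveContext.sql("SELECT key FROM src")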

Inherited from HiveContext

Inherited from SQLContext

Inherited from Serializable

Inherited from Serializable

Inherited from UDFRegistration

Inherited from ExpressionConversions

Inherited from SQLConf

Inherited from Logging

Inherited from AnyRef

Inherited from Any
