org.apache.spark.sql.hive.test

TestHiveContext

class TestHiveContext extends LocalHiveContext

A locally running test instance of Spark's Hive execution engine.

Data from testTables will be automatically loaded whenever a query is run over those tables. Calling reset will delete all tables and other state in the database, leaving the database in a "clean" state.

TestHive is the singleton object version of this class, because instantiating multiple copies of the hive metastore seems to lead to weird non-deterministic failures. Test cases that rely on TestHive must therefore be executed serially.
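
The typical flow described above can be sketched as follows. This is a minimal, illustrative example, assuming the Spark 1.x `sql/hive` test artifacts are on the classpath and that `src` is one of the standard pre-registered test tables; it is not verified against any particular release.

```scala
import org.apache.spark.sql.hive.test.TestHive

// Use the shared TestHive singleton rather than constructing a new
// TestHiveContext: multiple metastore instances in one JVM lead to
// non-deterministic failures (see the class description above).
val rows = TestHive.hql("SELECT key, value FROM src").collect()

// Return the metastore to a "clean" state before the next test case.
TestHive.reset()
```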

Self Type
TestHiveContext
Linear Supertypes
LocalHiveContext, HiveContext, SQLContext, Serializable, Serializable, ExpressionConversions, com.typesafe.scalalogging.slf4j.Logging, AnyRef, Any

Instance Constructors

  1. new TestHiveContext(sc: SparkContext)

Type Members

  1. implicit class DslAttribute extends AnyRef

    Definition Classes
    ExpressionConversions
  2. implicit class DslExpression extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  3. implicit class DslString extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  4. implicit class DslSymbol extends ImplicitAttribute

    Definition Classes
    ExpressionConversions
  5. class HiveQLQueryExecution extends QueryExecution

    Attributes
    protected[org.apache.spark.sql.hive]
  6. abstract class ImplicitAttribute extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  7. abstract class QueryExecution extends TestHiveContext.QueryExecution

    Overrides QueryExecution with a special debug workflow.

  8. class SparkPlanner extends SparkStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  9. implicit class SqlCmd extends AnyRef

    Attributes
    protected[org.apache.spark.sql.hive]
  10. case class TestTable(name: String, commands: () ⇒ Unit*) extends Product with Serializable

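The `TestTable` case class above pairs a table name with a variable number of setup thunks (`() ⇒ Unit*`) that create and populate the table; `registerTestTable` adds it to `testTables`, and the thunks run lazily the first time a query touches the table. A hedged sketch, where the table name, DDL, and data file path are hypothetical:

```scala
import org.apache.spark.sql.hive.test.TestHive
import TestHive._

// Hypothetical test table: the commands below do not run now, only when
// a query first references "sample_kv" (via loadTestTable).
val sampleTable = TestTable("sample_kv",
  () => runSqlHive("CREATE TABLE sample_kv (key INT, value STRING)"),
  () => runSqlHive(
    s"LOAD DATA LOCAL INPATH '${getHiveFile("data/files/kv1.txt")}' " +
      "INTO TABLE sample_kv"))

registerTestTable(sampleTable)
```
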
Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. lazy val analyzer: Analyzer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  7. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  8. implicit def binaryToLiteral(a: Array[Byte]): Literal

    Definition Classes
    ExpressionConversions
  9. implicit def booleanToLiteral(b: Boolean): Literal

    Definition Classes
    ExpressionConversions
  10. implicit def byteToLiteral(b: Byte): Literal

    Definition Classes
    ExpressionConversions
  11. def cacheTable(tableName: String): Unit

    Definition Classes
    SQLContext
  12. var cacheTables: Boolean

  13. lazy val catalog: HiveMetastoreCatalog with OverrideCatalog

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  14. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  15. def configure(): Unit

    Sets up the system initially or after a RESET command

    Attributes
    protected
    Definition Classes
    LocalHiveContext
  16. def createParquetFile[A <: Product](path: String, allowExisting: Boolean, conf: Configuration)(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  17. implicit def createSchemaRDD[A <: Product](rdd: RDD[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): SchemaRDD

    Definition Classes
    SQLContext
  18. def createTable[A <: Product](tableName: String, allowExisting: Boolean = true)(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): Unit

    Creates a table using the schema of the given class.

    A: a case class used to describe the schema of the table to be created.

    tableName: the name of the table to create.

    allowExisting: when false, an exception will be thrown if the table already exists.

    Definition Classes
    HiveContext
  19. implicit def decimalToLiteral(d: BigDecimal): Literal

    Definition Classes
    ExpressionConversions
  20. val describedTable: Regex

  21. implicit def doubleToLiteral(d: Double): Literal

    Definition Classes
    ExpressionConversions
  22. lazy val emptyResult: RDD[Row]

    Attributes
    protected
    Definition Classes
    HiveContext
  23. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  24. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  25. def executePlan(plan: LogicalPlan): QueryExecution

    Definition Classes
    TestHiveContext → HiveContext → SQLContext
  26. def executeSql(sql: String): TestHiveContext.QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  27. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  28. implicit def floatToLiteral(f: Float): Literal

    Definition Classes
    ExpressionConversions
  29. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  30. def getHiveFile(path: String): File

  31. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  32. lazy val hiveDevHome: Option[File]

    The location of the hive source code.

  33. val hiveFilesTemp: File

  34. lazy val hiveHome: Option[File]

    The location of the compiled hive distribution.

  35. val hivePlanner: SparkPlanner with HiveStrategies

    Definition Classes
    HiveContext
  36. val hiveQTestUtilTables: Seq[TestTable]

  37. lazy val hiveconf: HiveConf

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    HiveContext
  38. def hiveql(hqlQuery: String): SchemaRDD

    Executes a query expressed in HiveQL using Spark, returning the result as a SchemaRDD.

    Definition Classes
    HiveContext
  39. def hql(hqlQuery: String): SchemaRDD

    An alias for hiveql.

    Definition Classes
    HiveContext
  40. val inRepoTests: File

  41. implicit def intToLiteral(i: Int): Literal

    Definition Classes
    ExpressionConversions
  42. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  43. def loadTestTable(name: String): Unit

  44. lazy val logger: Logger

    Attributes
    protected
    Definition Classes
    Logging
  45. implicit def logicalPlanToSparkQuery(plan: LogicalPlan): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  46. implicit def longToLiteral(l: Long): Literal

    Definition Classes
    ExpressionConversions
  47. lazy val metastorePath: String

    Definition Classes
    TestHiveContext → LocalHiveContext
  48. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  49. final def notify(): Unit

    Definition Classes
    AnyRef
  50. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  51. val optimizer: Optimizer.type

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  52. val originalUdfs: Set[String]

    Records the UDFs present when the server starts, so we can delete ones that are created by tests.

    Attributes
    protected
  53. val outputBuffer: OutputStream { ... /* 4 definitions in type refinement */ }

    Attributes
    protected
    Definition Classes
    HiveContext
  54. def parquetFile(path: String): SchemaRDD

    Definition Classes
    SQLContext
  55. def parseSql(sql: String): LogicalPlan

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  56. val parser: SqlParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  57. val planner: SparkPlanner with HiveStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  58. val prepareForExecution: RuleExecutor[SparkPlan] { val batches: List[this.Batch] }

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  59. def registerRDDAsTable(rdd: SchemaRDD, tableName: String): Unit

    Definition Classes
    SQLContext
  60. def registerTestTable(testTable: TestTable): HashMap[String, TestTable]

  61. def reset(): Unit

    Resets the test instance by deleting any tables that have been created. TODO: also clear out UDFs, views, etc.

  62. def runHive(cmd: String, maxRows: Int = 1000): Seq[String]

    Execute the command using Hive and return the results as a sequence. Each element in the sequence is one row.

    Attributes
    protected
    Definition Classes
    HiveContext
  63. def runSqlHive(sql: String): Seq[String]

    Runs the specified SQL query using Hive.

    Definition Classes
    TestHiveContext → HiveContext
  64. lazy val sessionState: SessionState

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    HiveContext
  65. implicit def shortToLiteral(s: Short): Literal

    Definition Classes
    ExpressionConversions
  66. val sparkContext: SparkContext

    Definition Classes
    SQLContext
  67. def sql(sqlText: String): SchemaRDD

    Definition Classes
    SQLContext
  68. implicit def stringToLiteral(s: String): Literal

    Definition Classes
    ExpressionConversions
  69. implicit def symbolToUnresolvedAttribute(s: Symbol): UnresolvedAttribute

    Definition Classes
    ExpressionConversions
  70. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  71. def table(tableName: String): SchemaRDD

    Definition Classes
    SQLContext
  72. lazy val testTables: HashMap[String, TestTable]

    A list of test tables and the DDL required to initialize them. A test table is loaded on demand when a query is run against it.

  73. implicit def timestampToLiteral(t: Timestamp): Literal

    Definition Classes
    ExpressionConversions
  74. def toString(): String

    Definition Classes
    AnyRef → Any
  75. def uncacheTable(tableName: String): Unit

    Definition Classes
    SQLContext
  76. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  77. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  78. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  79. lazy val warehousePath: String

    Definition Classes
    TestHiveContext → LocalHiveContext

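Taken together, a test case against this context usually queries one of the pre-registered test tables (which triggers loadTestTable automatically), optionally caches loaded tables via cacheTables, and calls reset() so the next serialized test starts from a known state. A minimal sketch, assuming the standard Hive QTest table `src` is present in hiveQTestUtilTables:

```scala
import org.apache.spark.sql.hive.test.TestHive

// When true, tables loaded by loadTestTable are also cached.
TestHive.cacheTables = true

// Querying 'src' loads it on demand from testTables.
val count = TestHive.hql("SELECT COUNT(*) FROM src").collect()

// Delete all tables and other state, leaving the database "clean".
TestHive.reset()
```
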