org.apache.spark.sql.hive.test

TestHive

object TestHive extends TestHiveContext
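
Example: a minimal sketch of driving TestHive from a test, assuming a Spark 1.3-era test classpath and the standard "src" test table that TestHiveContext defines.

  import org.apache.spark.sql.hive.test.TestHive

  // TestHive manages its own local SparkContext plus a temporary metastore
  // and warehouse directory, so a test needs no further setup.
  TestHive.loadTestTable("src")    // created and populated on first use
  val result = TestHive.sql("SELECT key, value FROM src WHERE key < 10")
  result.collect().foreach(println)
  TestHive.reset()                 // drop tables created during the test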

Linear Supertypes
TestHiveContext, HiveContext, SQLContext, Serializable, Serializable, Logging, AnyRef, Any

Type Members

  1. class HiveQLQueryExecution extends QueryExecution

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    TestHiveContext
  2. class QueryExecution extends TestHiveContext.QueryExecution

    Overrides QueryExecution with a special debug workflow.

  3. class SparkPlanner extends SparkStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  4. implicit class SqlCmd extends AnyRef

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    TestHiveContext
  5. case class TestTable(name: String, commands: () ⇒ Unit*) extends Product with Serializable

    Definition Classes
    TestHiveContext
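
    A hedged sketch of defining and registering a custom test table (the table name and commands here are hypothetical). The thunks run once, the first time loadTestTable is called for the table:

      import org.apache.spark.sql.hive.test.TestHive
      import org.apache.spark.sql.hive.test.TestHive.TestTable

      val kvExample = TestTable("kv_example",
        () => TestHive.runSqlHive(
          "CREATE TABLE kv_example (key INT, value STRING)"),
        () => TestHive.runSqlHive(
          "INSERT INTO TABLE kv_example SELECT key, value FROM src"))

      TestHive.registerTestTable(kvExample)
      TestHive.loadTestTable("src")         // dependency of the INSERT above
      TestHive.loadTestTable("kv_example")  // runs both commands exactly once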

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def analyze(tableName: String): Unit

    Analyzes the given table in the current database to generate statistics, which will be used in query optimizations.

    Right now, it only supports Hive tables and it only updates the size of a Hive table in the Hive metastore.

    Definition Classes
    HiveContext
    Annotations
    @Experimental()
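
    A short hedged sketch (assumes the standard "src" test table): once analyzed, the total size recorded in the metastore lets the planner choose, for example, a broadcast join against spark.sql.autoBroadcastJoinThreshold.

      TestHive.loadTestTable("src")
      TestHive.analyze("src")  // writes table size statistics to the Hive metastore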
  7. lazy val analyzer: Analyzer { val extendedResolutionRules: List[org.apache.spark.sql.catalyst.rules.Rule[org.apache.spark.sql.catalyst.plans.logical.LogicalPlan]] }

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  8. def applySchemaToPythonRDD(rdd: RDD[Array[Any]], schema: StructType): DataFrame

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  9. def applySchemaToPythonRDD(rdd: RDD[Array[Any]], schemaString: String): DataFrame

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  10. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  11. def baseRelationToDataFrame(baseRelation: BaseRelation): DataFrame

    Definition Classes
    SQLContext
  12. val cacheManager: CacheManager

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  13. def cacheTable(tableName: String): Unit

    Definition Classes
    SQLContext
  14. var cacheTables: Boolean

    Definition Classes
    TestHiveContext
  15. lazy val catalog: HiveMetastoreCatalog with OverrideCatalog

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  16. lazy val checkAnalysis: CheckAnalysis { val extendedCheckRules: Seq[org.apache.spark.sql.sources.PreWriteCheck] }

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  17. def clearCache(): Unit

    Definition Classes
    SQLContext
  18. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  19. lazy val conf: SQLConf

    Fewer partitions to speed up testing.

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    TestHiveContext → HiveContext → SQLContext
  20. def configure(): Unit

    Sets up the system initially or after a RESET command.

    Attributes
    protected
    Definition Classes
    TestHiveContext
  21. def convertCTAS: Boolean

    When true, a table created by a Hive CTAS statement (no USING clause) will be converted to a data source table, using the data source set by spark.sql.sources.default. The table in the CTAS statement will be converted when it meets any of the following conditions:

    • The CTAS does not specify any of a SerDe (ROW FORMAT SERDE), a File Format (STORED AS), or a Storage Handler (STORED BY), and the value of hive.default.fileformat in hive-site.xml is either TextFile or SequenceFile.
    • The CTAS statement specifies TextFile (STORED AS TEXTFILE) as the file format and no SerDe is specified (no ROW FORMAT SERDE clause).
    • The CTAS statement specifies SequenceFile (STORED AS SEQUENCEFILE) as the file format and no SerDe is specified (no ROW FORMAT SERDE clause).
    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext
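
    A hedged sketch of the conversion (assumes the spark.sql.hive.convertCTAS configuration key controls this flag in this version; the table name is hypothetical):

      TestHive.setConf("spark.sql.hive.convertCTAS", "true")
      TestHive.setConf("spark.sql.sources.default", "parquet")
      // No USING, ROW FORMAT SERDE, STORED AS, or STORED BY clause, so the
      // result becomes a Parquet-backed data source table rather than a
      // TextFile Hive table.
      TestHive.sql("CREATE TABLE ctas_example AS SELECT key, value FROM src")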
  22. def convertMetastoreParquet: Boolean

    When true, enables an experimental feature where metastore tables that use the parquet SerDe are automatically converted to use the Spark SQL parquet table scan, instead of the Hive SerDe.

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext
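
    Sketch of toggling the feature (assumes the spark.sql.hive.convertMetastoreParquet key backs this flag; the table name is hypothetical):

      // Disable the conversion to force reads through the Hive Parquet SerDe.
      TestHive.setConf("spark.sql.hive.convertMetastoreParquet", "false")
      TestHive.sql("SELECT * FROM some_parquet_table").collect()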
  23. def createDataFrame(rdd: JavaRDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
  24. def createDataFrame(rdd: RDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
  25. def createDataFrame(rowRDD: JavaRDD[Row], columns: java.util.List[String]): DataFrame

    Definition Classes
    SQLContext
  26. def createDataFrame(rowRDD: JavaRDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  27. def createDataFrame(rowRDD: RDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  28. def createDataFrame[A <: Product](data: Seq[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  29. def createDataFrame[A <: Product](rdd: RDD[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  30. def createExternalTable(tableName: String, source: String, schema: StructType, options: java.util.Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  31. def createExternalTable(tableName: String, source: String, schema: StructType, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  32. def createExternalTable(tableName: String, source: String, options: java.util.Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  33. def createExternalTable(tableName: String, source: String, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  34. def createExternalTable(tableName: String, path: String, source: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  35. def createExternalTable(tableName: String, path: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  36. val ddlParser: DDLParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  37. val ddlParserWithHiveQL: DDLParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext
  38. val describedTable: Regex

    Definition Classes
    TestHiveContext
  39. def dropTempTable(tableName: String): Unit

    Definition Classes
    SQLContext
  40. lazy val emptyDataFrame: DataFrame

    Definition Classes
    SQLContext
  41. lazy val emptyResult: RDD[Row]

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  42. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  43. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  44. def executePlan(plan: LogicalPlan): QueryExecution

    Definition Classes
    TestHiveContext → HiveContext → SQLContext
  45. def executeSql(sql: String): TestHive.QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  46. val experimental: ExperimentalMethods

    Definition Classes
    SQLContext
  47. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  48. lazy val functionRegistry: HiveFunctionRegistry with OverrideFunctionRegistry

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  49. def getAllConfs: Map[String, String]

    Definition Classes
    SQLContext
  50. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  51. def getConf(key: String, defaultValue: String): String

    Definition Classes
    SQLContext
  52. def getConf(key: String): String

    Definition Classes
    SQLContext
  53. def getHiveFile(path: String): File

    Definition Classes
    TestHiveContext
  54. def getSchema(beanClass: Class[_]): Seq[AttributeReference]

    Attributes
    protected
    Definition Classes
    SQLContext
  55. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  56. lazy val hiveDevHome: Option[File]

    The location of the Hive source code.

    Definition Classes
    TestHiveContext
  57. val hiveFilesTemp: File

    Definition Classes
    TestHiveContext
  58. lazy val hiveHome: Option[File]

    The location of the compiled Hive distribution.

    Definition Classes
    TestHiveContext
  59. val hiveQTestUtilTables: Seq[TestTable]

    Definition Classes
    TestHiveContext
  60. lazy val hiveconf: HiveConf

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    HiveContext
  61. val inRepoTests: File

    Definition Classes
    TestHiveContext
  62. def invalidateTable(tableName: String): Unit

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    HiveContext
  63. def isCached(tableName: String): Boolean

    Definition Classes
    SQLContext
  64. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  65. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  66. def jdbc(url: String, table: String, theParts: Array[String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  67. def jdbc(url: String, table: String, columnName: String, lowerBound: Long, upperBound: Long, numPartitions: Int): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  68. def jdbc(url: String, table: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  69. def jsonFile(path: String, samplingRatio: Double): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  70. def jsonFile(path: String, schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  71. def jsonFile(path: String): DataFrame

    Definition Classes
    SQLContext
  72. def jsonRDD(json: JavaRDD[String], samplingRatio: Double): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  73. def jsonRDD(json: RDD[String], samplingRatio: Double): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  74. def jsonRDD(json: JavaRDD[String], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  75. def jsonRDD(json: RDD[String], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  76. def jsonRDD(json: JavaRDD[String]): DataFrame

    Definition Classes
    SQLContext
  77. def jsonRDD(json: RDD[String]): DataFrame

    Definition Classes
    SQLContext
  78. def load(source: String, schema: StructType, options: java.util.Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  79. def load(source: String, schema: StructType, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  80. def load(source: String, options: java.util.Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  81. def load(source: String, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  82. def load(path: String, source: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  83. def load(path: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  84. def loadTestTable(name: String): Unit

    Definition Classes
    TestHiveContext
  85. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  86. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  87. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  88. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  89. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  90. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  91. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  92. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  93. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  94. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  95. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  96. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  97. lazy val metastorePath: String

    Definition Classes
    TestHiveContext
  98. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  99. final def notify(): Unit

    Definition Classes
    AnyRef
  100. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  101. lazy val optimizer: Optimizer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  102. val originalUdfs: Set[String]

    Records the UDFs present when the server starts, so we can delete ones that are created by tests.

    Attributes
    protected
    Definition Classes
    TestHiveContext
  103. lazy val outputBuffer: OutputStream { ... /* 4 definitions in type refinement */ }

    Attributes
    protected
    Definition Classes
    HiveContext
  104. def parquetFile(paths: String*): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @varargs()
  105. def parseDataType(dataTypeString: String): DataType

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  106. def parseSql(sql: String): LogicalPlan

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  107. val planner: SparkPlanner with HiveStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  108. val prepareForExecution: RuleExecutor[SparkPlan] { val batches: List[this.Batch] }

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  109. def refreshTable(tableName: String): Unit

    Invalidates and refreshes all the cached metadata of the given table. For performance reasons, Spark SQL or the external data source library it uses might cache certain metadata about a table, such as the location of blocks. When those change outside of Spark SQL, users should call this function to invalidate the cache.

    Definition Classes
    HiveContext
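
    Sketch (hypothetical table name), for when files backing a table are rewritten by an external process:

      // New Parquet files were written into the table's directory outside Spark SQL.
      TestHive.refreshTable("events_parquet")
      val fresh = TestHive.table("events_parquet")  // planning now sees fresh metadata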
  110. def registerTestTable(testTable: TestTable): HashMap[String, TestTable]

    Definition Classes
    TestHiveContext
  111. def reset(): Unit

    Resets the test instance by deleting any tables that have been created. TODO: also clear out UDFs, views, etc.

    Definition Classes
    TestHiveContext
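
    Sketch of the usual harness pattern built on reset():

      try {
        TestHive.sql("CREATE TABLE scratch (key INT)")
        // ... run assertions against the scratch table ...
      } finally {
        TestHive.reset()  // leave a clean metastore for the next suite
      }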
  112. def runHive(cmd: String, maxRows: Int = 1000): Seq[String]

    Executes the command using Hive and returns the results as a sequence. Each element in the sequence is one row.

    Attributes
    protected
    Definition Classes
    HiveContext
  113. def runSqlHive(sql: String): Seq[String]

    Runs the specified SQL query using Hive.

    Definition Classes
    TestHiveContext → HiveContext
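
    Sketch: this executes through Hive directly, bypassing the Spark SQL planner, and renders each row as one string (assumes the standard "src" test table):

      val described: Seq[String] = TestHive.runSqlHive("DESCRIBE src")
      described.foreach(println)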
  114. lazy val sessionState: SessionState

    SQLConf and HiveConf contracts:

    1. Reuse the existing started SessionState if any.
    2. When the Hive session is first initialized, params in HiveConf will get picked up by the SQLConf. Additionally, any properties set by set() or a SET command inside sql() will be set in the SQLConf *as well as* in the HiveConf.

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    HiveContext
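
    A hedged sketch of contract 2 (hive.exec.dynamic.partition is just an example Hive property; getConf here reads the SQLConf side):

      TestHive.sql("SET hive.exec.dynamic.partition=true")
      // The property was propagated to the SQLConf as well as the HiveConf.
      assert(TestHive.getConf("hive.exec.dynamic.partition") == "true")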
  115. def setConf(key: String, value: String): Unit

    Definition Classes
    HiveContext → SQLContext
  116. def setConf(props: Properties): Unit

    Definition Classes
    SQLContext
  117. val sparkContext: SparkContext

    Definition Classes
    SQLContext
  118. def sql(sqlText: String): DataFrame

    Definition Classes
    HiveContext → SQLContext
  119. val sqlParser: SparkSQLParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  120. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  121. def table(tableName: String): DataFrame

    Definition Classes
    SQLContext
  122. def tableNames(databaseName: String): Array[String]

    Definition Classes
    SQLContext
  123. def tableNames(): Array[String]

    Definition Classes
    SQLContext
  124. def tables(databaseName: String): DataFrame

    Definition Classes
    SQLContext
  125. def tables(): DataFrame

    Definition Classes
    SQLContext
  126. lazy val testTables: HashMap[String, TestTable]

    A list of test tables and the DDL required to initialize them. A test table is loaded on demand when a query is run against it.

    Definition Classes
    TestHiveContext
  127. val testTempDir: File

    Definition Classes
    TestHiveContext
  128. def toString(): String

    Definition Classes
    AnyRef → Any
  129. val udf: UDFRegistration

    Definition Classes
    SQLContext
  130. def uncacheTable(tableName: String): Unit

    Definition Classes
    SQLContext
  131. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  132. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  133. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  134. lazy val warehousePath: String

    Definition Classes
    TestHiveContext

Deprecated Value Members

  1. def applySchema(rdd: JavaRDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) use createDataFrame

  2. def applySchema(rdd: RDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) use createDataFrame

  3. def applySchema(rowRDD: JavaRDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) use createDataFrame

  4. def applySchema(rowRDD: RDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) use createDataFrame
