org.apache.spark.sql.hive

LocalHiveContext

class LocalHiveContext extends HiveContext

Starts up an instance of Hive where metadata is stored locally. An in-process metastore database is created with data stored in ./metadata, and warehouse data is stored in ./warehouse.
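A minimal usage sketch (assuming Spark 1.0-era APIs; the metastore and warehouse directories are created relative to the working directory):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.LocalHiveContext

// A local SparkContext; the in-process metastore lands in ./metadata
// and table data in ./warehouse, relative to the working directory.
val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("local-hive"))
val hiveContext = new LocalHiveContext(sc)

// HiveQL statements run against the in-process metastore.
hiveContext.hql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
```

Because everything is in-process and on the local filesystem, this is suited to tests and experimentation rather than production use.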

Linear Supertypes
HiveContext, SQLContext, Serializable, Serializable, ExpressionConversions, SQLConf, com.typesafe.scalalogging.slf4j.Logging, AnyRef, Any

Instance Constructors

  1. new LocalHiveContext(sc: SparkContext)

Type Members

  1. implicit class DslAttribute extends AnyRef

    Definition Classes
    ExpressionConversions
  2. implicit class DslExpression extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  3. implicit class DslString extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  4. implicit class DslSymbol extends ImplicitAttribute

    Definition Classes
    ExpressionConversions
  5. abstract class ImplicitAttribute extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  6. abstract class QueryExecution extends HiveContext.QueryExecution

    Extends QueryExecution with hive specific features.

  7. class SparkPlanner extends SparkStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. lazy val analyzer: Analyzer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  7. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  8. def avg(e: Expression): Average

    Definition Classes
    ExpressionConversions
  9. implicit def binaryToLiteral(a: Array[Byte]): Literal

    Definition Classes
    ExpressionConversions
  10. implicit def booleanToLiteral(b: Boolean): Literal

    Definition Classes
    ExpressionConversions
  11. implicit def byteToLiteral(b: Byte): Literal

    Definition Classes
    ExpressionConversions
  12. def cacheTable(tableName: String): Unit

    Definition Classes
    SQLContext
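For instance, a registered or Hive table can be cached in memory and later evicted by name (a sketch; `src` is a hypothetical table):

```scala
// given: val hiveContext = new LocalHiveContext(sc)
hiveContext.cacheTable("src")      // subsequent reads of "src" hit the in-memory cache
assert(hiveContext.isCached("src"))
hiveContext.uncacheTable("src")    // release the cached data
```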
  13. lazy val catalog: HiveMetastoreCatalog with OverrideCatalog

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  14. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  15. def configure(): Unit

Sets up the system initially or after a RESET command.

    Attributes
    protected
  16. def contains(key: String): Boolean

    Definition Classes
    SQLConf
  17. def count(e: Expression): Count

    Definition Classes
    ExpressionConversions
  18. def countDistinct(e: Expression*): CountDistinct

    Definition Classes
    ExpressionConversions
  19. def createParquetFile[A <: Product](path: String, allowExisting: Boolean, conf: Configuration)(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  20. implicit def createSchemaRDD[A <: Product](rdd: RDD[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): SchemaRDD

    Definition Classes
    SQLContext
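The implicit conversion lets an RDD of case-class instances be used directly as a SchemaRDD (a sketch; `Person` is a hypothetical case class, and `registerAsTable` comes from the implicit SchemaRDD view):

```scala
// given: val hiveContext = new LocalHiveContext(sc)
import hiveContext.createSchemaRDD

case class Person(name: String, age: Int)
val people = sc.parallelize(Seq(Person("Alice", 30), Person("Bob", 25)))

// createSchemaRDD implicitly turns RDD[Person] into a SchemaRDD,
// so it can be registered as a table and queried with SQL.
people.registerAsTable("people")
val adults = hiveContext.sql("SELECT name FROM people WHERE age >= 18")
```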
  21. def createTable[A <: Product](tableName: String, allowExisting: Boolean = true)(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): Unit

    Creates a table using the schema of the given class.

    A
    A case class that is used to describe the schema of the table to be created.
    tableName
    The name of the table to create.
    allowExisting
    When false, an exception will be thrown if the table already exists.

    Definition Classes
    HiveContext
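A sketch of creating a Hive table from a case-class schema (`Record` and the table name are hypothetical):

```scala
// given: val hiveContext = new LocalHiveContext(sc)
case class Record(key: Int, value: String)

// Creates the Hive table "records" with columns derived from Record's fields.
// With allowExisting = true (the default) this is a no-op if the table exists;
// with allowExisting = false an exception is thrown instead.
hiveContext.createTable[Record]("records", allowExisting = true)
```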
  22. implicit def decimalToLiteral(d: BigDecimal): Literal

    Definition Classes
    ExpressionConversions
  23. implicit def doubleToLiteral(d: Double): Literal

    Definition Classes
    ExpressionConversions
  24. lazy val emptyResult: RDD[Row]

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  25. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  26. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  27. def executePlan(plan: LogicalPlan): QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  28. def executeSql(sql: String): LocalHiveContext.QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  29. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  30. def first(e: Expression): First

    Definition Classes
    ExpressionConversions
  31. implicit def floatToLiteral(f: Float): Literal

    Definition Classes
    ExpressionConversions
  32. def get(key: String, defaultValue: String): String

    Definition Classes
    SQLConf
  33. def get(key: String): String

    Definition Classes
    SQLConf
  34. def getAll: Array[(String, String)]

    Definition Classes
    SQLConf
  35. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  36. def getOption(key: String): Option[String]

    Definition Classes
    SQLConf
  37. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  38. val hivePlanner: SparkPlanner with HiveStrategies

    Definition Classes
    HiveContext
  39. lazy val hiveconf: HiveConf

    SQLConf and HiveConf contracts: when the Hive session is first initialized, params in HiveConf will get picked up by the SQLConf. Additionally, any properties set by set() or a SET command inside hql() or sql() will be set in the SQLConf *as well as* in the HiveConf.

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    HiveContext
  40. def hiveql(hqlQuery: String): SchemaRDD

    Executes a query expressed in HiveQL using Spark, returning the result as a SchemaRDD.

    Definition Classes
    HiveContext
  41. def hql(hqlQuery: String): SchemaRDD

    An alias for hiveql.

    Definition Classes
    HiveContext
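Together, hiveql and its hql alias are the HiveQL entry point; since the result is a SchemaRDD, ordinary RDD operations apply (a sketch; the `src` table is hypothetical):

```scala
// given: val hiveContext = new LocalHiveContext(sc)
val results = hiveContext.hql("SELECT key, value FROM src WHERE key < 10")

// SchemaRDD is an RDD of Rows, so map/collect/etc. work as usual.
results.map(row => row.getInt(0)).collect().foreach(println)
```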
  42. implicit def intToLiteral(i: Int): Literal

    Definition Classes
    ExpressionConversions
  43. def isCached(tableName: String): Boolean

    Definition Classes
    SQLContext
  44. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  45. def jsonFile(path: String, samplingRatio: Double): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  46. def jsonFile(path: String): SchemaRDD

    Definition Classes
    SQLContext
  47. def jsonRDD(json: RDD[String], samplingRatio: Double): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  48. def jsonRDD(json: RDD[String]): SchemaRDD

    Definition Classes
    SQLContext
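A sketch of loading JSON, one object per line, with the schema inferred from the data (the path is hypothetical):

```scala
// given: val hiveContext = new LocalHiveContext(sc)
// One JSON object per line; the schema is inferred from the records.
val logs = hiveContext.jsonFile("/tmp/logs.json")

// samplingRatio < 1.0 infers the schema from a fraction of the records,
// trading schema accuracy for speed on large inputs.
val sampled = hiveContext.jsonRDD(sc.textFile("/tmp/logs.json"), 0.1)
```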
  49. lazy val logger: Logger

    Attributes
    protected
    Definition Classes
    Logging
  50. implicit def logicalPlanToSparkQuery(plan: LogicalPlan): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  51. implicit def longToLiteral(l: Long): Literal

    Definition Classes
    ExpressionConversions
  52. def lower(e: Expression): Lower

    Definition Classes
    ExpressionConversions
  53. def max(e: Expression): Max

    Definition Classes
    ExpressionConversions
  54. lazy val metastorePath: String

  55. def min(e: Expression): Min

    Definition Classes
    ExpressionConversions
  56. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  57. final def notify(): Unit

    Definition Classes
    AnyRef
  58. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  59. val optimizer: Optimizer.type

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  60. val outputBuffer: OutputStream { ... /* 4 definitions in type refinement */ }

    Attributes
    protected
    Definition Classes
    HiveContext
  61. def parquetFile(path: String): SchemaRDD

    Definition Classes
    SQLContext
  62. def parseSql(sql: String): LogicalPlan

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  63. val parser: SqlParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  64. val planner: SparkPlanner with HiveStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  65. val prepareForExecution: RuleExecutor[SparkPlan] { val batches: List[this.Batch] }

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  66. def registerRDDAsTable(rdd: SchemaRDD, tableName: String): Unit

    Definition Classes
    SQLContext
  67. def runHive(cmd: String, maxRows: Int = 1000): Seq[String]

    Execute the command using Hive and return the results as a sequence. Each element in the sequence is one row.

    Attributes
    protected
    Definition Classes
    HiveContext
  68. def runSqlHive(sql: String): Seq[String]

    Runs the specified SQL query using Hive.

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext
  69. lazy val sessionState: SessionState

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    HiveContext
  70. def set(key: String, value: String): Unit

    Definition Classes
    HiveContext → SQLConf
  71. def set(props: Properties): Unit

    Definition Classes
    SQLConf
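Configuration properties flow through SQLConf and, per the HiveConf contract above, are mirrored into HiveConf as well (a sketch; the key and values are illustrative):

```scala
// given: val hiveContext = new LocalHiveContext(sc)
// set() records the property in the SQLConf and in the HiveConf.
hiveContext.set("spark.sql.shuffle.partitions", "10")

// get with a default avoids an exception for unset keys.
val partitions = hiveContext.get("spark.sql.shuffle.partitions", "200")
```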
  72. implicit def shortToLiteral(s: Short): Literal

    Definition Classes
    ExpressionConversions
  73. val sparkContext: SparkContext

    Definition Classes
    SQLContext
  74. def sql(sqlText: String): SchemaRDD

    Definition Classes
    SQLContext
  75. implicit def stringToLiteral(s: String): Literal

    Definition Classes
    ExpressionConversions
  76. def sum(e: Expression): Sum

    Definition Classes
    ExpressionConversions
  77. def sumDistinct(e: Expression): SumDistinct

    Definition Classes
    ExpressionConversions
  78. implicit def symbolToUnresolvedAttribute(s: Symbol): UnresolvedAttribute

    Definition Classes
    ExpressionConversions
  79. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  80. def table(tableName: String): SchemaRDD

    Definition Classes
    SQLContext
  81. implicit def timestampToLiteral(t: Timestamp): Literal

    Definition Classes
    ExpressionConversions
  82. def toDebugString: String

    Definition Classes
    SQLConf
  83. def toString(): String

    Definition Classes
    AnyRef → Any
  84. def uncacheTable(tableName: String): Unit

    Definition Classes
    SQLContext
  85. def upper(e: Expression): Upper

    Definition Classes
    ExpressionConversions
  86. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  87. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  88. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  89. lazy val warehousePath: String
