org.apache.spark.sql.crossdata

XDContext

class XDContext extends SQLContext with Logging

XDContext leverages the features of SQLContext and adds capabilities specific to the Crossdata system.

Self Type
XDContext
Linear Supertypes
SQLContext, Serializable, Serializable, Logging, AnyRef, Any

Instance Constructors

  1. new XDContext(sc: SparkContext, config: Config)

  2. new XDContext(sc: SparkContext)
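
    Example (a minimal sketch of building an XDContext on a local SparkContext; the application name and master URL are placeholders):

      import org.apache.spark.{SparkConf, SparkContext}
      import org.apache.spark.sql.crossdata.XDContext

      // Placeholder Spark configuration for a local run.
      val sparkConf = new SparkConf().setAppName("crossdata-example").setMaster("local[2]")
      val sc = new SparkContext(sparkConf)

      // The single-argument constructor uses the default Crossdata configuration;
      // the two-argument variant accepts an explicit Typesafe Config instead.
      val xdContext = new XDContext(sc)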

Type Members

  1. class XDPlanner extends execution.SparkPlanner with XDStrategies

    Annotations
    @transient()
  2. class QueryExecution extends execution.QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.6.0) use org.apache.spark.sql.QueryExecution

  3. class SparkPlanner extends execution.SparkPlanner

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.6.0) use org.apache.spark.sql.SparkPlanner

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def addJar(path: String): Unit

    Adds a JAR file from the XD Driver to the context.

    path

    The local or HDFS path from which the SparkContext will fetch the JAR

    Definition Classes
    XDContext → SQLContext
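
    Example (a minimal sketch, assuming an existing XDContext named xdContext; the path is a placeholder):

      // Make a connector JAR available to the underlying SparkContext.
      xdContext.addJar("/opt/crossdata/connectors/example-connector.jar")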
  7. lazy val analyzer: Analyzer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    XDContext → SQLContext
  8. def applySchemaToPythonRDD(rdd: RDD[Array[Any]], schema: StructType): DataFrame

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  9. def applySchemaToPythonRDD(rdd: RDD[Array[Any]], schemaString: String): DataFrame

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  10. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  11. def baseRelationToDataFrame(baseRelation: BaseRelation): DataFrame

    Definition Classes
    SQLContext
  12. val cacheManager: execution.CacheManager

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  13. def cacheTable(tableName: String): Unit

    Definition Classes
    SQLContext
  14. lazy val catalog: XDCatalog

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    XDContext → SQLContext
  15. def checkCatalogConnection: Boolean

    Checks whether a connection with the catalog can be established.

    returns

    true if the connection is possible
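
    Example (a small sketch, assuming an existing XDContext named xdContext):

      // Probe the catalog before issuing metadata operations.
      if (xdContext.checkCatalogConnection) println("catalog reachable")
      else println("catalog unreachable")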

  16. def clearCache(): Unit

    Definition Classes
    SQLContext
  17. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  18. lazy val conf: SQLConf

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  19. def createDataFrame(rows: Seq[Row], schema: StructType): DataFrame
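
    Example (a minimal sketch of this overload, assuming an existing XDContext named xdContext):

      import org.apache.spark.sql.Row
      import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

      // Explicit schema matching the shape of the rows below.
      val schema = StructType(Seq(
        StructField("name", StringType),
        StructField("age", IntegerType)
      ))
      val df = xdContext.createDataFrame(Seq(Row("Alice", 34), Row("Bob", 28)), schema)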

  20. def createDataFrame(data: List[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
  21. def createDataFrame(rdd: JavaRDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
  22. def createDataFrame(rdd: RDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
  23. def createDataFrame(rows: List[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  24. def createDataFrame(rowRDD: JavaRDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  25. def createDataFrame(rowRDD: RDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  26. def createDataFrame[A <: Product](data: Seq[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  27. def createDataFrame[A <: Product](rdd: RDD[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  28. def createDataset[T](data: List[T])(implicit arg0: Encoder[T]): Dataset[T]

    Definition Classes
    SQLContext
  29. def createDataset[T](data: RDD[T])(implicit arg0: Encoder[T]): Dataset[T]

    Definition Classes
    SQLContext
  30. def createDataset[T](data: Seq[T])(implicit arg0: Encoder[T]): Dataset[T]

    Definition Classes
    SQLContext
  31. def createExternalTable(tableName: String, source: String, schema: StructType, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  32. def createExternalTable(tableName: String, source: String, schema: StructType, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  33. def createExternalTable(tableName: String, source: String, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
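
    Example (a sketch of registering an external table, assuming an existing XDContext named xdContext; the table name, source and path are placeholders):

      // Register a JSON-backed table in the catalog without eagerly loading its data.
      xdContext.createExternalTable(
        "people",
        "json",
        Map("path" -> "/data/people.json")
      )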
  34. def createExternalTable(tableName: String, source: String, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  35. def createExternalTable(tableName: String, path: String, source: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  36. def createExternalTable(tableName: String, path: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  37. val ddlParser: XDDdlParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    XDContext → SQLContext
  38. def dialectClassName: String

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  39. def dropAllTables(): Unit

    Drops all tables in the persistent catalog. This applies only to metadata; the underlying data is not deleted.
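
    Example (assuming an existing XDContext named xdContext):

      // Clears every table entry from the persistent catalog; underlying data is untouched.
      xdContext.dropAllTables()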

  40. def dropTable(tableIdentifier: TableIdentifier): Unit

    Drops the given table from the persistent catalog. This applies only to metadata; the underlying data is not deleted.

    tableIdentifier

    the table to be dropped.
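
    Example (a minimal sketch, assuming an existing XDContext named xdContext; the table and database names are placeholders):

      import org.apache.spark.sql.catalyst.TableIdentifier

      // Only the catalog entry is removed, not the data it points to.
      xdContext.dropTable(TableIdentifier("people", Some("mydb")))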

  41. def dropTempTable(tableName: String): Unit

    Definition Classes
    SQLContext
  42. lazy val emptyDataFrame: DataFrame

    Definition Classes
    SQLContext
  43. lazy val emptyResult: RDD[InternalRow]

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  44. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  45. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  46. def executePlan(plan: LogicalPlan): execution.QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  47. def executeSql(sql: String): execution.QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  48. val experimental: ExperimentalMethods

    Definition Classes
    SQLContext
  49. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  50. lazy val functionRegistry: FunctionRegistry

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    XDContext → SQLContext
  51. def getAllConfs: Map[String, String]

    Definition Classes
    SQLContext
  52. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  53. def getConf(key: String, defaultValue: String): String

    Definition Classes
    SQLContext
  54. def getConf(key: String): String

    Definition Classes
    SQLContext
  55. def getSQLDialect(): ParserDialect

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  56. def getSchema(beanClass: Class[_]): Seq[AttributeReference]

    Attributes
    protected
    Definition Classes
    SQLContext
  57. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  58. def importTables(datasource: String, opts: Map[String, String]): Unit

    Imports tables from a data source into the persistent catalog.

    datasource

    the fully qualified name of the data source provider

    opts

    the options the data source needs to discover and register its tables
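
    Example (for illustration only, assuming an existing XDContext named xdContext; the provider name and options are hypothetical and depend on the connector in use):

      // Ask a data source to list its tables and persist them in the catalog.
      xdContext.importTables(
        "com.example.crossdata.connector",  // hypothetical provider class
        Map("host" -> "localhost:27017")    // hypothetical connector options
      )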

  59. def isCached(tableName: String): Boolean

    Definition Classes
    SQLContext
  60. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  61. val isRootContext: Boolean

    Definition Classes
    SQLContext
  62. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  63. lazy val listenerManager: ExecutionListenerManager

    Definition Classes
    SQLContext
  64. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  65. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  66. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  67. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  68. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  69. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  70. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  71. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  72. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  73. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  74. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  75. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  76. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  77. def newSession(): SQLContext

    Definition Classes
    SQLContext
  78. final def notify(): Unit

    Definition Classes
    AnyRef
  79. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  80. lazy val optimizer: Optimizer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  81. def parseDataType(dataTypeString: String): DataType

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  82. def parseSql(sql: String): LogicalPlan

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  83. val planner: execution.SparkPlanner

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    XDContext → SQLContext
  84. val prepareForExecution: RuleExecutor[SparkPlan]

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  85. def range(start: Long, end: Long, step: Long, numPartitions: Int): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  86. def range(start: Long, end: Long): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  87. def range(end: Long): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  88. def read: DataFrameReader

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
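
    Example (the standard SQLContext reader, assuming an existing XDContext named xdContext; format and path are placeholders):

      val people = xdContext.read
        .format("json")
        .load("/data/people.json")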
  89. val sc: SparkContext

    A SparkContext.

  90. lazy val securityManager: security.SecurityManager

    Attributes
    protected[org.apache.spark.sql.crossdata]
  91. def setConf(key: String, value: String): Unit

    Definition Classes
    SQLContext
  92. def setConf(props: Properties): Unit

    Definition Classes
    SQLContext
  93. val sparkContext: SparkContext

    Definition Classes
    SQLContext
  94. def sql(sqlText: String): DataFrame

    Definition Classes
    XDContext → SQLContext
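
    Example (XDContext overrides sql, as the XDContext → SQLContext definition above indicates, so Crossdata-specific statements are also accepted here; a plain query works as in SQLContext, assuming an existing XDContext named xdContext and a registered table people):

      // A plain query against a table registered in the catalog.
      val adults = xdContext.sql("SELECT name, age FROM people WHERE age > 30")
      adults.show()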
  95. val sqlParser: SparkSQLParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  96. lazy val streamingCatalog: Option[XDStreamingCatalog]

    Attributes
    protected[org.apache.spark.sql.crossdata]
  97. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  98. def table(tableName: String): DataFrame

    Definition Classes
    SQLContext
  99. def tableNames(databaseName: String): Array[String]

    Definition Classes
    SQLContext
  100. def tableNames(): Array[String]

    Definition Classes
    SQLContext
  101. def tables(databaseName: String): DataFrame

    Definition Classes
    SQLContext
  102. def tables(): DataFrame

    Definition Classes
    SQLContext
  103. def toString(): String

    Definition Classes
    AnyRef → Any
  104. val udf: UDFRegistration

    Definition Classes
    SQLContext
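
    Example (standard UDF registration inherited from SQLContext, assuming an existing XDContext named xdContext and a registered table people):

      // Register a Scala function and call it from SQL.
      xdContext.udf.register("toUpper", (s: String) => s.toUpperCase)
      xdContext.sql("SELECT toUpper(name) FROM people").show()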
  105. def uncacheTable(tableName: String): Unit

    Definition Classes
    SQLContext
  106. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  107. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  108. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def applySchema(rdd: JavaRDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) Use createDataFrame. This will be removed in Spark 2.0.

  2. def applySchema(rdd: RDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) Use createDataFrame. This will be removed in Spark 2.0.

  3. def applySchema(rowRDD: JavaRDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) Use createDataFrame. This will be removed in Spark 2.0.

  4. def applySchema(rowRDD: RDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) Use createDataFrame. This will be removed in Spark 2.0.

  5. def jdbc(url: String, table: String, theParts: Array[String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.jdbc(). This will be removed in Spark 2.0.

  6. def jdbc(url: String, table: String, columnName: String, lowerBound: Long, upperBound: Long, numPartitions: Int): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.jdbc(). This will be removed in Spark 2.0.

  7. def jdbc(url: String, table: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.jdbc(). This will be removed in Spark 2.0.

  8. def jsonFile(path: String, samplingRatio: Double): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  9. def jsonFile(path: String, schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  10. def jsonFile(path: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  11. def jsonRDD(json: JavaRDD[String], samplingRatio: Double): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  12. def jsonRDD(json: RDD[String], samplingRatio: Double): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  13. def jsonRDD(json: JavaRDD[String], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  14. def jsonRDD(json: RDD[String], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  15. def jsonRDD(json: JavaRDD[String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  16. def jsonRDD(json: RDD[String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  17. def load(source: String, schema: StructType, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).schema(schema).options(options).load(). This will be removed in Spark 2.0.

  18. def load(source: String, schema: StructType, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).schema(schema).options(options).load(). This will be removed in Spark 2.0.

  19. def load(source: String, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).options(options).load(). This will be removed in Spark 2.0.

  20. def load(source: String, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).options(options).load(). This will be removed in Spark 2.0.

  21. def load(path: String, source: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).load(path). This will be removed in Spark 2.0.

  22. def load(path: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.load(path). This will be removed in Spark 2.0.

  23. def parquetFile(paths: String*): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated @varargs()
    Deprecated

    (Since version 1.4.0) Use read.parquet(). This will be removed in Spark 2.0.
