org.apache.spark.sql.cassandra

CassandraSQLContext

Related Docs: object CassandraSQLContext | package cassandra

class CassandraSQLContext extends SQLContext

Allows executing SQL queries against Cassandra and accessing the results as SchemaRDD collections. Predicate pushdown to Cassandra is supported.

Example:

import com.datastax.spark.connector._
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.cassandra.CassandraSQLContext

val sparkMasterHost = "127.0.0.1"
val cassandraHost = "127.0.0.1"

// Tell Spark the address of one Cassandra node:
val conf = new SparkConf(true).set("spark.cassandra.connection.host", cassandraHost)

// Connect to the Spark cluster:
val sc = new SparkContext("spark://" + sparkMasterHost + ":7077", "example", conf)

// Create CassandraSQLContext:
val cc = new CassandraSQLContext(sc)

// Execute SQL query:
val rdd = cc.sql("SELECT * FROM keyspace.table ...")
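
Continuing the example, a default keyspace can be set so that tables may be referenced by unqualified names. The sketch below is illustrative only: it assumes a keyspace "test" containing a table "words" with columns word and count.

// Use "test" as the default keyspace for unqualified table names:
cc.setKeyspace("test")

// cassandraSql (and sql, which delegates to it) returns a SchemaRDD;
// predicates on suitable columns may be pushed down to Cassandra:
val words = cc.cassandraSql("SELECT word, count FROM words WHERE count > 10")
words.collect().foreach(println)
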
Linear Supertypes
SQLContext, Serializable, Serializable, UDFRegistration, ExpressionConversions, CacheManager, SQLConf, Logging, AnyRef, Any

Instance Constructors

  1. new CassandraSQLContext(sc: SparkContext)

Type Members

  1. implicit class DslAttribute extends AnyRef

    Definition Classes
    ExpressionConversions
  2. implicit class DslExpression extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  3. implicit class DslString extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  4. implicit class DslSymbol extends ImplicitAttribute

    Definition Classes
    ExpressionConversions
  5. abstract class ImplicitAttribute extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  6. abstract class QueryExecution extends AnyRef

    Attributes
    protected
    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  7. class SparkPlanner extends SparkStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. def +(other: String): String

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to any2stringadd[CassandraSQLContext] performed by method any2stringadd in scala.Predef.
    Definition Classes
    any2stringadd
  4. def ->[B](y: B): (CassandraSQLContext, B)

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to ArrowAssoc[CassandraSQLContext] performed by method ArrowAssoc in scala.Predef.
    Definition Classes
    ArrowAssoc
    Annotations
    @inline()
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  6. def abs(e: Expression): Abs

    Definition Classes
    ExpressionConversions
  7. def addClusterLevelCassandraConnConf(cluster: String, conf: CassandraConnectorConf): CassandraSQLContext

    Add cluster level Cassandra connection configuration settings

  8. def addClusterLevelCassandraConnConf(cluster: String, conf: SparkConf): CassandraSQLContext

    Add cluster level Cassandra connection configuration settings

  9. def addClusterLevelReadConf(cluster: String, conf: ReadConf): CassandraSQLContext

    Add cluster level read configuration settings

  10. def addClusterLevelReadConf(cluster: String, conf: SparkConf): CassandraSQLContext

    Add cluster level read configuration settings

  11. def addClusterLevelWriteConf(cluster: String, conf: WriteConf): CassandraSQLContext

    Add cluster level write configuration settings

  12. def addClusterLevelWriteConf(cluster: String, conf: SparkConf): CassandraSQLContext

    Add cluster level write configuration settings

  13. def addKeyspaceLevelReadConf(keyspace: String, conf: ReadConf, cluster: Option[String]): CassandraSQLContext

    Add keyspace level read configuration settings. Set cluster to None for a single cluster

  14. def addKeyspaceLevelReadConf(keyspace: String, conf: SparkConf, cluster: Option[String]): CassandraSQLContext

    Add keyspace level read configuration settings. Set cluster to None for a single cluster

  15. def addKeyspaceLevelWriteConf(keyspace: String, writeConf: WriteConf, cluster: Option[String]): CassandraSQLContext

    Add keyspace level write configuration settings. Set cluster to None for a single cluster

  16. def addKeyspaceLevelWriteConf(keyspace: String, conf: SparkConf, cluster: Option[String]): CassandraSQLContext

    Add keyspace level write configuration settings. Set cluster to None for a single cluster

  17. def addTableReadConf(keyspace: String, table: String, conf: ReadConf, cluster: Option[String]): CassandraSQLContext

    Add table level read configuration settings. Set cluster to None for a single cluster

  18. def addTableReadConf(keyspace: String, table: String, conf: SparkConf, cluster: Option[String]): CassandraSQLContext

    Add table level read configuration settings. Set cluster to None for a single cluster

  19. def addTableWriteConf(keyspace: String, table: String, conf: WriteConf, cluster: Option[String]): CassandraSQLContext

    Add table level write configuration settings. Set cluster to None for a single cluster

  20. def addTableWriteConf(keyspace: String, table: String, conf: SparkConf, cluster: Option[String]): CassandraSQLContext

    Add table level write configuration settings. Set cluster to None for a single cluster

  21. lazy val analyzer: Analyzer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  22. def applySchema(rowRDD: RDD[Row], schema: StructType): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  23. def approxCountDistinct(e: Expression, rsd: Double): ApproxCountDistinct

    Definition Classes
    ExpressionConversions
  24. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  25. def avg(e: Expression): Average

    Definition Classes
    ExpressionConversions
  26. implicit def baseRelationToSchemaRDD(baseRelation: BaseRelation): SchemaRDD

    Definition Classes
    SQLContext
  27. implicit def bigDecimalToLiteral(d: BigDecimal): Literal

    Definition Classes
    ExpressionConversions
  28. implicit def binaryToLiteral(a: Array[Byte]): Literal

    Definition Classes
    ExpressionConversions
  29. implicit def booleanToLiteral(b: Boolean): Literal

    Definition Classes
    ExpressionConversions
  30. implicit def byteToLiteral(b: Byte): Literal

    Definition Classes
    ExpressionConversions
  31. def cacheTable(tableName: String): Unit

    Definition Classes
    CacheManager
  32. def cassandraSql(cassandraQuery: String): SchemaRDD

    Executes an SQL query against Cassandra and returns a SchemaRDD representing the result.

  33. lazy val catalog: CassandraCatalog with OverrideCatalog

    A catalyst metadata catalog that points to Cassandra.

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    CassandraSQLContext → SQLContext
  34. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  35. val conf: SparkConf

  36. def count(e: Expression): Count

    Definition Classes
    ExpressionConversions
  37. def countDistinct(e: Expression*): CountDistinct

    Definition Classes
    ExpressionConversions
  38. def createParquetFile[A <: Product](path: String, allowExisting: Boolean, conf: Configuration)(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  39. implicit def createSchemaRDD[A <: Product](rdd: RDD[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): SchemaRDD

    Definition Classes
    SQLContext
  40. implicit def dateToLiteral(d: Date): Literal

    Definition Classes
    ExpressionConversions
  41. val ddlParser: DDLParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  42. implicit def decimalToLiteral(d: Decimal): Literal

    Definition Classes
    ExpressionConversions
  43. implicit def doubleToLiteral(d: Double): Literal

    Definition Classes
    ExpressionConversions
  44. def dropTempTable(tableName: String): Unit

    Definition Classes
    SQLContext
  45. lazy val emptyResult: RDD[Row]

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  46. def ensuring(cond: (CassandraSQLContext) ⇒ Boolean, msg: ⇒ Any): CassandraSQLContext

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to Ensuring[CassandraSQLContext] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  47. def ensuring(cond: (CassandraSQLContext) ⇒ Boolean): CassandraSQLContext

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to Ensuring[CassandraSQLContext] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  48. def ensuring(cond: Boolean, msg: ⇒ Any): CassandraSQLContext

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to Ensuring[CassandraSQLContext] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  49. def ensuring(cond: Boolean): CassandraSQLContext

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to Ensuring[CassandraSQLContext] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  50. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  51. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  52. def executePlan(plan: LogicalPlan): QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    CassandraSQLContext → SQLContext
  53. def executeSql(sql: String): QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  54. var extraStrategies: Seq[Strategy]

    Definition Classes
    SQLContext
  55. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  56. def first(e: Expression): First

    Definition Classes
    ExpressionConversions
  57. implicit def floatToLiteral(f: Float): Literal

    Definition Classes
    ExpressionConversions
  58. def formatted(fmtstr: String): String

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to StringFormat[CassandraSQLContext] performed by method StringFormat in scala.Predef.
    Definition Classes
    StringFormat
    Annotations
    @inline()
  59. lazy val functionRegistry: FunctionRegistry

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  60. def getAllConfs: Map[String, String]

    Definition Classes
    SQLConf
  61. def getCassandraConnConf(cluster: Option[String]): CassandraConnectorConf

    Get Cassandra connection configuration settings, resolved in order: cluster level settings, then default settings

  62. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  63. def getConf(key: String, defaultValue: String): String

    Definition Classes
    SQLConf
  64. def getConf(key: String): String

    Definition Classes
    SQLConf
  65. def getKeyspace: String

    Returns the keyspace set previously by setKeyspace, or throws IllegalStateException if the keyspace has not been set yet.

  66. def getReadConf(keyspace: String, table: String, cluster: Option[String]): ReadConf

    Get read configuration settings, resolved in order: table level, keyspace level, cluster level, then default settings (see the sketch after the member list)

  67. def getWriteConf(keyspace: String, table: String, cluster: Option[String]): WriteConf

    Get write configuration settings, resolved in order: table level, keyspace level, cluster level, then default settings

  68. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  69. implicit def intToLiteral(i: Int): Literal

    Definition Classes
    ExpressionConversions
  70. def isCached(tableName: String): Boolean

    Definition Classes
    CacheManager
  71. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  72. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  73. def jsonFile(path: String, samplingRatio: Double): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  74. def jsonFile(path: String, schema: StructType): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  75. def jsonFile(path: String): SchemaRDD

    Definition Classes
    SQLContext
  76. def jsonRDD(json: RDD[String], samplingRatio: Double): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  77. def jsonRDD(json: RDD[String], schema: StructType): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  78. def jsonRDD(json: RDD[String]): SchemaRDD

    Definition Classes
    SQLContext
  79. def last(e: Expression): Last

    Definition Classes
    ExpressionConversions
  80. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  81. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  82. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  83. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  84. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  85. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  86. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  87. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  88. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  89. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  90. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  91. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  92. implicit def logicalPlanToSparkQuery(plan: LogicalPlan): SchemaRDD

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  93. implicit def longToLiteral(l: Long): Literal

    Definition Classes
    ExpressionConversions
  94. def lower(e: Expression): Lower

    Definition Classes
    ExpressionConversions
  95. def max(e: Expression): Max

    Definition Classes
    ExpressionConversions
  96. def min(e: Expression): Min

    Definition Classes
    ExpressionConversions
  97. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  98. final def notify(): Unit

    Definition Classes
    AnyRef
  99. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  100. lazy val optimizer: Optimizer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  101. def parquetFile(path: String): SchemaRDD

    Definition Classes
    SQLContext
  102. def parseSql(sql: String): LogicalPlan

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  103. val planner: SparkPlanner with CassandraStrategies { val strategies: Seq[org.apache.spark.sql.Strategy] }

    Modified Catalyst planner that does Cassandra-specific predicate pushdown

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    CassandraSQLContext → SQLContext
  104. val prepareForExecution: RuleExecutor[SparkPlan] { val batches: List[this.Batch] }

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  105. def registerFunction[T](name: String, func: Function22[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  106. def registerFunction[T](name: String, func: Function21[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  107. def registerFunction[T](name: String, func: Function20[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  108. def registerFunction[T](name: String, func: Function19[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  109. def registerFunction[T](name: String, func: Function18[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  110. def registerFunction[T](name: String, func: Function17[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  111. def registerFunction[T](name: String, func: Function16[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  112. def registerFunction[T](name: String, func: Function15[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  113. def registerFunction[T](name: String, func: Function14[_, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  114. def registerFunction[T](name: String, func: Function13[_, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  115. def registerFunction[T](name: String, func: Function12[_, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  116. def registerFunction[T](name: String, func: Function11[_, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  117. def registerFunction[T](name: String, func: Function10[_, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  118. def registerFunction[T](name: String, func: Function9[_, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  119. def registerFunction[T](name: String, func: Function8[_, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  120. def registerFunction[T](name: String, func: Function7[_, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  121. def registerFunction[T](name: String, func: Function6[_, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  122. def registerFunction[T](name: String, func: Function5[_, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  123. def registerFunction[T](name: String, func: Function4[_, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  124. def registerFunction[T](name: String, func: Function3[_, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  125. def registerFunction[T](name: String, func: Function2[_, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  126. def registerFunction[T](name: String, func: Function1[_, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  127. def registerRDDAsTable(rdd: SchemaRDD, tableName: String): Unit

    Definition Classes
    SQLContext
  128. def setConf(key: String, value: String): Unit

    Definition Classes
    SQLConf
  129. def setConf(props: Properties): Unit

    Definition Classes
    SQLConf
  130. def setKeyspace(ks: String): Unit

    Sets the default Cassandra keyspace to be used when accessing tables with unqualified names.

  131. val settings: Map[String, String]

    Attributes
    protected[org.apache.spark]
    Definition Classes
    SQLConf
  132. implicit def shortToLiteral(s: Short): Literal

    Definition Classes
    ExpressionConversions
  133. val sparkContext: SparkContext

    Definition Classes
    SQLContext
  134. def sql(cassandraQuery: String): SchemaRDD

    Delegates to cassandraSql.

    Definition Classes
    CassandraSQLContext → SQLContext
  135. val sqlParser: SparkSQLParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  136. def sqrt(e: Expression): Sqrt

    Definition Classes
    ExpressionConversions
  137. implicit def stringToLiteral(s: String): Literal

    Definition Classes
    ExpressionConversions
  138. def sum(e: Expression): Sum

    Definition Classes
    ExpressionConversions
  139. def sumDistinct(e: Expression): SumDistinct

    Definition Classes
    ExpressionConversions
  140. implicit def symbolToUnresolvedAttribute(s: Symbol): UnresolvedAttribute

    Definition Classes
    ExpressionConversions
  141. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  142. def table(tableName: String): SchemaRDD

    Definition Classes
    SQLContext
  143. implicit def timestampToLiteral(t: Timestamp): Literal

    Definition Classes
    ExpressionConversions
  144. def toString(): String

    Definition Classes
    AnyRef → Any
  145. def uncacheTable(tableName: String): Unit

    Definition Classes
    CacheManager
  146. def upper(e: Expression): Upper

    Definition Classes
    ExpressionConversions
  147. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  148. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  149. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  150. def →[B](y: B): (CassandraSQLContext, B)

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to ArrowAssoc[CassandraSQLContext] performed by method ArrowAssoc in scala.Predef.
    Definition Classes
    ArrowAssoc
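
The cluster, keyspace, and table level configuration methods above return this CassandraSQLContext and can therefore be chained. Below is a minimal sketch of table level read settings, continuing the example at the top of this page; the property name spark.cassandra.input.split.size and the keyspace/table names are illustrative and may differ between connector versions.

import org.apache.spark.SparkConf

// Read settings supplied as a SparkConf (the exact property keys depend on the connector version):
val readSettings = new SparkConf().set("spark.cassandra.input.split.size", "10000")

// Table level settings override keyspace level, which override cluster level,
// which override the defaults; pass cluster = None for a single cluster:
cc.addTableReadConf("test", "words", readSettings, None)

// Resolve the effective read settings in the same order:
val readConf = cc.getReadConf("test", "words", None)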
