org.apache.spark.sql.cassandra

CassandraSQLContext

class CassandraSQLContext extends SQLContext

Allows executing SQL queries against Cassandra and accessing the results as SchemaRDD collections. Predicate pushdown to Cassandra is supported.

Example:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.cassandra.CassandraSQLContext
import com.datastax.spark.connector._

val sparkMasterHost = "127.0.0.1"
val cassandraHost = "127.0.0.1"

// Tell Spark the address of one Cassandra node:
val conf = new SparkConf(true).set("spark.cassandra.connection.host", cassandraHost)

// Connect to the Spark cluster:
val sc = new SparkContext("spark://" + sparkMasterHost + ":7077", "example", conf)

// Create CassandraSQLContext:
val cc = new CassandraSQLContext(sc)

// Execute SQL query:
val rdd = cc.sql("SELECT * FROM keyspace.table ...")
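
Because predicates are pushed down, WHERE restrictions that Cassandra can serve natively (for example, equality on a partition key column) are evaluated by Cassandra itself rather than by Spark. A minimal sketch of a pushed-down filter, assuming a hypothetical table test.users with a partition key column id:

// Only rows matching the filter are transferred from Cassandra to Spark:
val filtered = cc.sql("SELECT * FROM test.users WHERE id = 10")
filtered.show()
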
Linear Supertypes
SQLContext, Serializable, Serializable, Logging, AnyRef, Any

Instance Constructors

  1. new CassandraSQLContext(sc: SparkContext)

Type Members

  1. class QueryExecution extends execution.QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.6.0) use org.apache.spark.sql.QueryExecution

  2. class SparkPlanner extends execution.SparkPlanner

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.6.0) use org.apache.spark.sql.SparkPlanner

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. def +(other: String): String

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to StringAdd performed by method any2stringadd in scala.Predef.
    Definition Classes
    StringAdd
  5. def ->[B](y: B): (CassandraSQLContext, B)

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to ArrowAssoc[CassandraSQLContext] performed by method any2ArrowAssoc in scala.Predef.
    Definition Classes
    ArrowAssoc
    Annotations
    @inline()
  6. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  7. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  8. def addJar(path: String): Unit

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  9. lazy val analyzer: Analyzer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  10. def applySchemaToPythonRDD(rdd: RDD[Array[Any]], schema: StructType): DataFrame

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  11. def applySchemaToPythonRDD(rdd: RDD[Array[Any]], schemaString: String): DataFrame

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  12. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  13. def baseRelationToDataFrame(baseRelation: BaseRelation): DataFrame

    Definition Classes
    SQLContext
  14. val cacheManager: execution.CacheManager

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  15. def cacheTable(tableName: String): Unit

    Definition Classes
    SQLContext
  16. def cassandraSql(cassandraQuery: String): DataFrame

Executes an SQL query against Cassandra and returns a DataFrame representing the result.
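
A minimal usage sketch, assuming a hypothetical table kv in keyspace test:

val df = cc.cassandraSql("SELECT key, value FROM test.kv")
df.collect().foreach(println)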

  17. lazy val catalog: CassandraCatalog

A catalyst metadata catalog that points to Cassandra.

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    CassandraSQLContext → SQLContext
  18. def clearCache(): Unit

    Definition Classes
    SQLContext
  19. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  20. lazy val conf: SQLConf

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  21. def createDataFrame(data: List[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
  22. def createDataFrame(rdd: JavaRDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
  23. def createDataFrame(rdd: RDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
  24. def createDataFrame(rows: List[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  25. def createDataFrame(rowRDD: JavaRDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  26. def createDataFrame(rowRDD: RDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  27. def createDataFrame[A <: Product](data: Seq[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  28. def createDataFrame[A <: Product](rdd: RDD[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  29. def createDataset[T](data: List[T])(implicit arg0: Encoder[T]): Dataset[T]

    Definition Classes
    SQLContext
  30. def createDataset[T](data: RDD[T])(implicit arg0: Encoder[T]): Dataset[T]

    Definition Classes
    SQLContext
  31. def createDataset[T](data: Seq[T])(implicit arg0: Encoder[T]): Dataset[T]

    Definition Classes
    SQLContext
  32. def createExternalTable(tableName: String, source: String, schema: StructType, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  33. def createExternalTable(tableName: String, source: String, schema: StructType, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  34. def createExternalTable(tableName: String, source: String, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  35. def createExternalTable(tableName: String, source: String, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  36. def createExternalTable(tableName: String, path: String, source: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  37. def createExternalTable(tableName: String, path: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  38. val ddlParser: DDLParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  39. def dialectClassName: String

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  40. def dropTempTable(tableName: String): Unit

    Definition Classes
    SQLContext
  41. lazy val emptyDataFrame: DataFrame

    Definition Classes
    SQLContext
  42. lazy val emptyResult: RDD[InternalRow]

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  43. def ensuring(cond: (CassandraSQLContext) ⇒ Boolean, msg: ⇒ Any): CassandraSQLContext

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to Ensuring[CassandraSQLContext] performed by method any2Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  44. def ensuring(cond: (CassandraSQLContext) ⇒ Boolean): CassandraSQLContext

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to Ensuring[CassandraSQLContext] performed by method any2Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  45. def ensuring(cond: Boolean, msg: ⇒ Any): CassandraSQLContext

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to Ensuring[CassandraSQLContext] performed by method any2Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  46. def ensuring(cond: Boolean): CassandraSQLContext

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to Ensuring[CassandraSQLContext] performed by method any2Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  47. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  48. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  49. def executePlan(plan: LogicalPlan): execution.QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    CassandraSQLContext → SQLContext
  50. def executeSql(sql: String): execution.QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  51. val experimental: ExperimentalMethods

    Definition Classes
    SQLContext
  52. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  53. def formatted(fmtstr: String): String

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to StringFormat performed by method any2stringfmt in scala.Predef.
    Definition Classes
    StringFormat
    Annotations
    @inline()
  54. lazy val functionRegistry: FunctionRegistry

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  55. def getAllConfs: Map[String, String]

    Definition Classes
    SQLContext
  56. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  57. def getCluster: String

Returns the name of the Cassandra cluster currently in use.

  58. def getConf(key: String, defaultValue: String): String

    Definition Classes
    SQLContext
  59. def getConf(key: String): String

    Definition Classes
    SQLContext
  60. def getKeyspace: String

Returns the keyspace/database set previously by setKeyspace, or throws IllegalStateException if the keyspace has not been set yet.
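
A short sketch of the set/get pair, assuming the hypothetical keyspace name "test":

cc.setKeyspace("test")
assert(cc.getKeyspace == "test")
// Calling getKeyspace before any setKeyspace call throws IllegalStateException.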

  61. def getSQLDialect(): ParserDialect

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  62. def getSchema(beanClass: Class[_]): Seq[AttributeReference]

    Attributes
    protected
    Definition Classes
    SQLContext
  63. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  64. def isCached(tableName: String): Boolean

    Definition Classes
    SQLContext
  65. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  66. val isRootContext: Boolean

    Definition Classes
    SQLContext
  67. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  68. lazy val listenerManager: ExecutionListenerManager

    Definition Classes
    SQLContext
  69. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  70. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  71. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  72. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  73. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  74. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  75. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  76. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  77. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  78. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  79. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  80. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  81. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  82. def newSession(): SQLContext

    Definition Classes
    SQLContext
  83. final def notify(): Unit

    Definition Classes
    AnyRef
  84. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  85. lazy val optimizer: Optimizer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  86. def parseDataType(dataTypeString: String): DataType

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  87. def parseSql(sql: String): LogicalPlan

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  88. val planner: execution.SparkPlanner

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  89. val prepareForExecution: RuleExecutor[SparkPlan]

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  90. def range(start: Long, end: Long, step: Long, numPartitions: Int): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  91. def range(start: Long, end: Long): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  92. def range(end: Long): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  93. def read: DataFrameReader

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  94. def setCluster(cluster: String): Unit

Sets the name of the Cassandra cluster to be used.
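
A minimal sketch pairing setCluster with getCluster, assuming the hypothetical cluster name "analytics":

cc.setCluster("analytics")
println(cc.getCluster) // prints: analytics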

  95. def setConf(cluster: String, keyspace: String, options: Map[String, String]): CassandraSQLContext

Sets the Spark Cassandra Connector configuration parameters that will be used when accessing a given keyspace in a given cluster.

  96. def setConf(cluster: String, options: Map[String, String]): CassandraSQLContext

Sets the Spark Cassandra Connector configuration parameters that will be used when accessing a given cluster.

  97. def setConf(options: Map[String, String]): CassandraSQLContext

Sets the Spark Cassandra Connector configuration parameters.
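
Since each of these overloads returns a CassandraSQLContext, the calls can be chained. A hedged sketch, assuming hypothetical cluster and keyspace names; the split-size option key is an assumption about this connector version:

cc.setConf(Map("spark.cassandra.connection.host" -> "10.0.0.1"))
  .setConf("ClusterTwo", Map("spark.cassandra.connection.host" -> "10.0.0.2"))
  .setConf("ClusterTwo", "test", Map("spark.cassandra.input.split.size_in_mb" -> "64")) // option key assumed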

  98. def setConf(key: String, value: String): Unit

    Definition Classes
    SQLContext
  99. def setConf(props: Properties): Unit

    Definition Classes
    SQLContext
  100. def setDatabase(db: String): Unit

Sets the current database name. Database is equivalent to keyspace.

  101. def setKeyspace(ks: String): Unit

Sets the default Cassandra keyspace to be used when accessing tables with unqualified names.
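
After the default keyspace is set, tables can be referenced by unqualified names. A minimal sketch, assuming a hypothetical table kv in keyspace test:

cc.setKeyspace("test")
val df = cc.sql("SELECT * FROM kv") // "kv" resolves to test.kv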

  102. val sparkContext: SparkContext

    Definition Classes
    SQLContext
  103. def sql(cassandraQuery: String): DataFrame

Delegates to cassandraSql.

    Definition Classes
    CassandraSQLContext → SQLContext
  104. val sqlParser: SparkSQLParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  105. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  106. def table(tableName: String): DataFrame

    Definition Classes
    SQLContext
  107. def tableNames(databaseName: String): Array[String]

    Definition Classes
    SQLContext
  108. def tableNames(): Array[String]

    Definition Classes
    SQLContext
  109. def tables(databaseName: String): DataFrame

    Definition Classes
    SQLContext
  110. def tables(): DataFrame

    Definition Classes
    SQLContext
  111. def toString(): String

    Definition Classes
    AnyRef → Any
  112. val udf: UDFRegistration

    Definition Classes
    SQLContext
  113. def uncacheTable(tableName: String): Unit

    Definition Classes
    SQLContext
  114. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  115. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  116. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  117. def →[B](y: B): (CassandraSQLContext, B)

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to ArrowAssoc[CassandraSQLContext] performed by method any2ArrowAssoc in scala.Predef.
    Definition Classes
    ArrowAssoc

Shadowed Implicit Value Members

  1. val self: Any

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to StringAdd performed by method any2stringadd in scala.Predef.
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraSQLContext: StringAdd).self
    Definition Classes
    StringAdd
  2. val self: Any

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to StringFormat performed by method any2stringfmt in scala.Predef.
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraSQLContext: StringFormat).self
    Definition Classes
    StringFormat

Deprecated Value Members

  1. def applySchema(rdd: JavaRDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) Use createDataFrame. This will be removed in Spark 2.0.

  2. def applySchema(rdd: RDD[_], beanClass: Class[_]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) Use createDataFrame. This will be removed in Spark 2.0.

  3. def applySchema(rowRDD: JavaRDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) Use createDataFrame. This will be removed in Spark 2.0.

  4. def applySchema(rowRDD: RDD[Row], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) Use createDataFrame. This will be removed in Spark 2.0.

  5. def jdbc(url: String, table: String, theParts: Array[String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.jdbc(). This will be removed in Spark 2.0.

  6. def jdbc(url: String, table: String, columnName: String, lowerBound: Long, upperBound: Long, numPartitions: Int): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.jdbc(). This will be removed in Spark 2.0.

  7. def jdbc(url: String, table: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.jdbc(). This will be removed in Spark 2.0.

  8. def jsonFile(path: String, samplingRatio: Double): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  9. def jsonFile(path: String, schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  10. def jsonFile(path: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  11. def jsonRDD(json: JavaRDD[String], samplingRatio: Double): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  12. def jsonRDD(json: RDD[String], samplingRatio: Double): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  13. def jsonRDD(json: JavaRDD[String], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  14. def jsonRDD(json: RDD[String], schema: StructType): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  15. def jsonRDD(json: JavaRDD[String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  16. def jsonRDD(json: RDD[String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.json(). This will be removed in Spark 2.0.

  17. def load(source: String, schema: StructType, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).schema(schema).options(options).load(). This will be removed in Spark 2.0.

  18. def load(source: String, schema: StructType, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).schema(schema).options(options).load(). This will be removed in Spark 2.0.

  19. def load(source: String, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).options(options).load(). This will be removed in Spark 2.0.

  20. def load(source: String, options: Map[String, String]): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).options(options).load(). This will be removed in Spark 2.0.

  21. def load(path: String, source: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.format(source).load(path). This will be removed in Spark 2.0.

  22. def load(path: String): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.4.0) Use read.load(path). This will be removed in Spark 2.0.

  23. def parquetFile(paths: String*): DataFrame

    Definition Classes
    SQLContext
    Annotations
    @deprecated @varargs()
    Deprecated

    (Since version 1.4.0) Use read.parquet(). This will be removed in Spark 2.0.

  24. def x: CassandraSQLContext

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to ArrowAssoc[CassandraSQLContext] performed by method any2ArrowAssoc in scala.Predef.
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraSQLContext: ArrowAssoc[CassandraSQLContext]).x
    Definition Classes
    ArrowAssoc
    Annotations
    @deprecated
    Deprecated

    (Since version 2.10.0) Use leftOfArrow instead

  25. def x: CassandraSQLContext

    Implicit information
    This member is added by an implicit conversion from CassandraSQLContext to Ensuring[CassandraSQLContext] performed by method any2Ensuring in scala.Predef.
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraSQLContext: Ensuring[CassandraSQLContext]).x
    Definition Classes
    Ensuring
    Annotations
    @deprecated
    Deprecated

    (Since version 2.10.0) Use resultOfEnsuring instead

Inherited from SQLContext

Inherited from Serializable

Inherited from Serializable

Inherited from Logging

Inherited from AnyRef

Inherited from Any

Inherited by implicit conversion any2stringadd from CassandraSQLContext to StringAdd

Inherited by implicit conversion any2stringfmt from CassandraSQLContext to StringFormat

Inherited by implicit conversion any2ArrowAssoc from CassandraSQLContext to ArrowAssoc[CassandraSQLContext]

Inherited by implicit conversion any2Ensuring from CassandraSQLContext to Ensuring[CassandraSQLContext]
