com.github.jparkie.spark.cassandra

SparkCassBulkWriter

class SparkCassBulkWriter[T] extends Serializable with Logging

Linear Supertypes
Logging, Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new SparkCassBulkWriter(cassandraConnector: CassandraConnector, tableDef: TableDef, columnSelector: IndexedSeq[ColumnRef], rowWriter: RowWriter[T], sparkCassWriteConf: SparkCassWriteConf, sparkCassServerConf: SparkCassServerConf)
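
    A minimal construction sketch, assuming the collaborating values (the TableDef, column selection, RowWriter, and the two configuration objects) have already been obtained through the Spark Cassandra Connector and this library's configuration classes; they appear as placeholders below, and the row type MyRow is hypothetical.

      import com.datastax.spark.connector.ColumnRef
      import com.datastax.spark.connector.cql.{ CassandraConnector, TableDef }
      import com.datastax.spark.connector.writer.RowWriter
      import org.apache.spark.SparkConf

      // Hypothetical row type used only for this sketch.
      case class MyRow(id: Long, value: String)

      // Placeholders: in practice these come from the Spark Cassandra Connector
      // (schema lookup, RowWriterFactory) and from this library's write/server
      // configuration classes.
      val tableDef: TableDef                    = ???
      val columnSelector: IndexedSeq[ColumnRef] = ???
      val rowWriter: RowWriter[MyRow]           = ???
      val writeConf: SparkCassWriteConf         = ???
      val serverConf: SparkCassServerConf       = ???

      val bulkWriter = new SparkCassBulkWriter[MyRow](
        cassandraConnector  = CassandraConnector(new SparkConf()),
        tableDef            = tableDef,
        columnSelector      = columnSelector,
        rowWriter           = rowWriter,
        sparkCassWriteConf  = writeConf,
        sparkCassServerConf = serverConf
      )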

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. val columnNames: Seq[String]

  9. val columns: Seq[ColumnDef]

  10. val defaultTTL: Option[Long]

  11. val defaultTimestamp: Option[Long]

  12. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  13. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  14. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  15. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  16. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  17. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  18. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  19. val keyspaceName: String

  20. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  21. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  22. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  23. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  24. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  25. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  26. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  27. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  28. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  29. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  30. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  31. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  32. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  33. final def notify(): Unit

    Definition Classes
    AnyRef
  34. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  35. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  36. val tableName: String

  37. def toString(): String

    Definition Classes
    AnyRef → Any
  38. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  39. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  40. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  41. def write(taskContext: TaskContext, data: Iterator[T]): Unit

    Inserts T into Cassandra by writing SSTables to a temporary directory and then streaming them directly to the Cassandra nodes over the transport layer.

    taskContext

    The TaskContext provided by the Spark DAGScheduler.

    data

    The Iterator of T to persist.
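
    Because write has the (TaskContext, Iterator[T]) ⇒ Unit shape that SparkContext.runJob expects, a typical pattern is to run the writer once per partition of an RDD. A minimal sketch, assuming the bulkWriter value from the constructor sketch above and a hypothetical RDD of rows:

      import org.apache.spark.SparkContext
      import org.apache.spark.rdd.RDD

      // Hypothetical inputs for this sketch.
      val sc: SparkContext    = ???
      val rowsRdd: RDD[MyRow] = ???

      // Each partition is written as SSTables in a temporary directory and
      // then streamed directly to the Cassandra nodes by the bulk writer.
      sc.runJob(rowsRdd, bulkWriter.write _)

    The (TaskContext, Iterator[T]) signature is what lets the writer plug directly into Spark's job execution on the executors, without collecting data to the driver.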

Inherited from Logging

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
