Object

org.apache.spark.streaming.kafka010

KafkaUtils

object KafkaUtils extends Logging

:: Experimental :: object for constructing Kafka streams and RDDs

Annotations
@Experimental()
Linear Supertypes
Logging, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. def createDirectStream[K, V](jssc: JavaStreamingContext, locationStrategy: LocationStrategy, consumerStrategy: ConsumerStrategy[K, V], perPartitionConfig: PerPartitionConfig): JavaInputDStream[ConsumerRecord[K, V]]

    :: Experimental :: Java constructor for a DStream where each given Kafka topic/partition corresponds to an RDD partition.

    K
    type of Kafka message key
    V
    type of Kafka message value
    locationStrategy
    In most cases, pass in LocationStrategies.PreferConsistent; see LocationStrategies for more details.
    consumerStrategy
    In most cases, pass in ConsumerStrategies.Subscribe; see ConsumerStrategies for more details.
    perPartitionConfig
    configuration of settings such as max rate on a per-partition basis; see PerPartitionConfig for more details.

    Annotations
    @Experimental()
  7. def createDirectStream[K, V](jssc: JavaStreamingContext, locationStrategy: LocationStrategy, consumerStrategy: ConsumerStrategy[K, V]): JavaInputDStream[ConsumerRecord[K, V]]

    :: Experimental :: Java constructor for a DStream where each given Kafka topic/partition corresponds to an RDD partition.

    K
    type of Kafka message key
    V
    type of Kafka message value
    locationStrategy
    In most cases, pass in LocationStrategies.PreferConsistent; see LocationStrategies for more details.
    consumerStrategy
    In most cases, pass in ConsumerStrategies.Subscribe; see ConsumerStrategies for more details.

    Annotations
    @Experimental()
  8. def createDirectStream[K, V](ssc: StreamingContext, locationStrategy: LocationStrategy, consumerStrategy: ConsumerStrategy[K, V], perPartitionConfig: PerPartitionConfig): InputDStream[ConsumerRecord[K, V]]

    :: Experimental :: Scala constructor for a DStream where each given Kafka topic/partition corresponds to an RDD partition.

    K
    type of Kafka message key
    V
    type of Kafka message value
    locationStrategy
    In most cases, pass in LocationStrategies.PreferConsistent; see LocationStrategies for more details.
    consumerStrategy
    In most cases, pass in ConsumerStrategies.Subscribe; see ConsumerStrategies for more details.
    perPartitionConfig
    configuration of settings such as max rate on a per-partition basis; see PerPartitionConfig for more details.

    Annotations
    @Experimental()
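
    A hedged sketch of how a custom PerPartitionConfig might be passed to this overload. It assumes PerPartitionConfig's abstract member is maxRatePerPartition(TopicPartition): Long, and it reuses the ssc, kafkaParams and topics values from the sketch under the three-argument Scala overload below; the class name, partition numbers and rates are illustrative only.

      import org.apache.kafka.common.TopicPartition
      import org.apache.spark.streaming.kafka010.{KafkaUtils, PerPartitionConfig}
      import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
      import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

      // Hypothetical rate controller: throttle partition 0 harder than the rest.
      class SkewAwareRateConfig extends PerPartitionConfig {
        override def maxRatePerPartition(topicPartition: TopicPartition): Long =
          if (topicPartition.partition() == 0) 500L else 2000L
      }

      val stream = KafkaUtils.createDirectStream[String, String](
        ssc,
        PreferConsistent,
        Subscribe[String, String](topics, kafkaParams),
        new SkewAwareRateConfig()
      )
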
  9. def createDirectStream[K, V](ssc: StreamingContext, locationStrategy: LocationStrategy, consumerStrategy: ConsumerStrategy[K, V]): InputDStream[ConsumerRecord[K, V]]

    :: Experimental :: Scala constructor for a DStream where each given Kafka topic/partition corresponds to an RDD partition. The Spark configuration spark.streaming.kafka.maxRatePerPartition gives the maximum number of messages per second that each partition will accept.

    K
    type of Kafka message key
    V
    type of Kafka message value
    locationStrategy
    In most cases, pass in LocationStrategies.PreferConsistent; see LocationStrategies for more details.
    consumerStrategy
    In most cases, pass in ConsumerStrategies.Subscribe; see ConsumerStrategies for more details.

    Annotations
    @Experimental()
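
    A minimal usage sketch for this overload, assuming a Kafka broker at localhost:9092, a topic named example-topic and a consumer group example-group (all placeholders); the consumer settings follow the usual Spark + Kafka 0.10 direct-stream setup.

      import org.apache.kafka.common.serialization.StringDeserializer
      import org.apache.spark.SparkConf
      import org.apache.spark.streaming.{Seconds, StreamingContext}
      import org.apache.spark.streaming.kafka010.KafkaUtils
      import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
      import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

      val ssc = new StreamingContext(new SparkConf().setAppName("KafkaDirectStream"), Seconds(5))

      // Consumer configuration; broker address, group id and topic are placeholders.
      val kafkaParams = Map[String, Object](
        "bootstrap.servers" -> "localhost:9092",
        "key.deserializer" -> classOf[StringDeserializer],
        "value.deserializer" -> classOf[StringDeserializer],
        "group.id" -> "example-group",
        "auto.offset.reset" -> "latest",
        "enable.auto.commit" -> (false: java.lang.Boolean)
      )
      val topics = Array("example-topic")

      // Each subscribed Kafka topic/partition becomes one partition of the RDDs in this DStream.
      val stream = KafkaUtils.createDirectStream[String, String](
        ssc,
        PreferConsistent,
        Subscribe[String, String](topics, kafkaParams)
      )

      stream.map(record => (record.key, record.value)).print()
      ssc.start()
      ssc.awaitTermination()
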
  10. def createRDD[K, V](jsc: JavaSparkContext, kafkaParams: Map[String, AnyRef], offsetRanges: Array[OffsetRange], locationStrategy: LocationStrategy): JavaRDD[ConsumerRecord[K, V]]

    :: Experimental :: Java constructor for a batch-oriented interface for consuming from Kafka. Starting and ending offsets are specified in advance, so that you can control exactly-once semantics.

    K
    type of Kafka message key
    V
    type of Kafka message value
    kafkaParams
    Kafka configuration parameters. Requires "bootstrap.servers" to be set with Kafka broker(s) specified in host1:port1,host2:port2 form.
    offsetRanges
    offset ranges that define the Kafka data belonging to this RDD
    locationStrategy
    In most cases, pass in LocationStrategies.PreferConsistent; see LocationStrategies for more details.

    Annotations
    @Experimental()
  11. def createRDD[K, V](sc: SparkContext, kafkaParams: Map[String, AnyRef], offsetRanges: Array[OffsetRange], locationStrategy: LocationStrategy): RDD[ConsumerRecord[K, V]]

    :: Experimental :: Scala constructor for a batch-oriented interface for consuming from Kafka. Starting and ending offsets are specified in advance, so that you can control exactly-once semantics.

    K
    type of Kafka message key
    V
    type of Kafka message value
    kafkaParams
    Kafka configuration parameters. Requires "bootstrap.servers" to be set with Kafka broker(s) specified in host1:port1,host2:port2 form.
    offsetRanges
    offset ranges that define the Kafka data belonging to this RDD
    locationStrategy
    In most cases, pass in LocationStrategies.PreferConsistent; see LocationStrategies for more details.

    Annotations
    @Experimental()
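
    A minimal batch-read sketch, assuming the same placeholder broker, topic and group id as the streaming sketch above; note that this overload takes a java.util.Map for kafkaParams, hence the asJava conversion. The offset ranges shown are illustrative.

      import scala.collection.JavaConverters._
      import org.apache.kafka.common.serialization.StringDeserializer
      import org.apache.spark.{SparkConf, SparkContext}
      import org.apache.spark.streaming.kafka010.{KafkaUtils, OffsetRange}
      import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

      val sc = new SparkContext(new SparkConf().setAppName("KafkaBatchRead"))

      // Consumer configuration; broker address and group id are placeholders.
      val kafkaParams: java.util.Map[String, Object] = Map[String, Object](
        "bootstrap.servers" -> "localhost:9092",
        "key.deserializer" -> classOf[StringDeserializer],
        "value.deserializer" -> classOf[StringDeserializer],
        "group.id" -> "example-group"
      ).asJava

      // Offsets are fixed up front: topic, partition, inclusive start, exclusive end.
      val offsetRanges = Array(
        OffsetRange("example-topic", 0, 0L, 100L),
        OffsetRange("example-topic", 1, 0L, 100L)
      )

      val rdd = KafkaUtils.createRDD[String, String](sc, kafkaParams, offsetRanges, PreferConsistent)
      rdd.map(record => (record.key, record.value)).collect().foreach(println)
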
  12. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  13. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  14. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  15. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  16. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  17. def initializeLogIfNecessary(isInterpreter: Boolean): Unit

    Attributes
    protected
    Definition Classes
    Logging
  18. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  19. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  20. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  21. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  22. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  23. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  24. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  25. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  26. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  27. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  28. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  29. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  30. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  31. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  32. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  33. final def notify(): Unit

    Definition Classes
    AnyRef
  34. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  35. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  36. def toString(): String

    Definition Classes
    AnyRef → Any
  37. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  38. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  39. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from Logging

Inherited from AnyRef

Inherited from Any