com.github.benfradet.spark.kafka
Class used for writing DStreams to Kafka
Class used to write DStreams and RDDs to Kafka. Example usage:

    import java.util.Properties

    import com.github.benfradet.spark.kafka.writer.KafkaWriter._
    import org.apache.kafka.clients.producer.ProducerRecord
    import org.apache.kafka.common.serialization.StringSerializer
    import org.apache.spark.rdd.RDD
    import org.apache.spark.streaming.dstream.DStream

    val topic = "my-topic"
    val producerConfig = {
      val p = new Properties()
      p.setProperty("bootstrap.servers", "127.0.0.1:9092")
      p.setProperty("key.serializer", classOf[StringSerializer].getName)
      p.setProperty("value.serializer", classOf[StringSerializer].getName)
      p
    }

    val dStream: DStream[String] = ...
    dStream.writeToKafka(
      producerConfig,
      s => new ProducerRecord[String, String](topic, s)
    )

    val rdd: RDD[String] = ...
    rdd.writeToKafka(
      producerConfig,
      s => new ProducerRecord[String, String](topic, s)
    )
Class used for writing RDDs to Kafka
Implicit conversions from DStream and RDD to KafkaWriter
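The implicit conversions are what let a plain DStream or RDD pick up a writeToKafka method once the KafkaWriter import is in scope. A minimal sketch of this enrichment pattern, using illustrative names only (ToyWriter, writeAll, and seqToWriter are not part of the library) and a plain Seq standing in for an RDD so the example is self-contained:

```scala
import scala.language.implicitConversions

object WriterSyntax {
  // A toy "writer" wrapping a collection; the real KafkaWriter classes
  // wrap a DStream or an RDD instead and send records to Kafka.
  class ToyWriter[T](elems: Seq[T]) {
    // Stand-in for writeToKafka: apply the record-building function to
    // every element and collect the results instead of producing them.
    def writeAll(transform: T => String): Seq[String] = elems.map(transform)
  }

  // The implicit conversion: any Seq[T] silently gains writeAll,
  // mirroring how DStream -> KafkaWriter and RDD -> KafkaWriter work.
  implicit def seqToWriter[T](elems: Seq[T]): ToyWriter[T] =
    new ToyWriter(elems)
}

object Demo {
  import WriterSyntax._

  def main(args: Array[String]): Unit = {
    // writeAll is not defined on Seq; the compiler inserts seqToWriter.
    val out = Seq(1, 2, 3).writeAll(n => s"msg-$n")
    println(out)
  }
}
```

In the library itself, importing `com.github.benfradet.spark.kafka.writer.KafkaWriter._` plays the role of `import WriterSyntax._` above: it brings the conversions into scope so `dStream.writeToKafka(...)` and `rdd.writeToKafka(...)` compile.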