akka.kafka.scaladsl

object Producer

Akka Stream connector for publishing messages to Kafka topics.

Source
Producer.scala
Linear Supertypes
AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. def committableSink[K, V](settings: ProducerSettings[K, V], producer: Producer[K, V]): Sink[Envelope[K, V, Committable], Future[Done]]

    Create a sink that is aware of the committable offset from a Consumer.committableSource. It commits the consumer offset once the message has been published successfully to the topic.

    It publishes records to Kafka topics conditionally:

    - Message publishes a single message to its topic, and commits the offset

    - MultiMessage publishes all messages in its records field, and commits the offset

    - PassThroughMessage does not publish anything, but commits the offset

    Note that there is always a risk that something fails after publishing but before committing, so this provides "at-least-once delivery" semantics.

    Supports sharing a Kafka Producer instance.
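
    This overload accepts an existing Kafka Producer so that several streams can reuse one producer instance. A minimal sketch, assuming producerSettings and the envelope sources are defined elsewhere; the stage does not close a producer it did not create, so closing it remains the caller's responsibility:

```scala
// Sketch: share one Kafka producer instance across streams.
// `producerSettings`, `envelopeSource1` and `envelopeSource2` are
// assumed to be defined elsewhere (illustrative names).
val kafkaProducer: org.apache.kafka.clients.producer.Producer[String, String] =
  producerSettings.createKafkaProducer()

// Both streams publish through the same producer instead of creating their own.
val done1 = envelopeSource1.runWith(Producer.committableSink(producerSettings, kafkaProducer))
val done2 = envelopeSource2.runWith(Producer.committableSink(producerSettings, kafkaProducer))

// When all sharing streams have completed:
kafkaProducer.close()
```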

  7. def committableSink[K, V](settings: ProducerSettings[K, V]): Sink[Envelope[K, V, Committable], Future[Done]]

    Create a sink that is aware of the committable offset from a Consumer.committableSource. It commits the consumer offset once the message has been published successfully to the topic.

    It publishes records to Kafka topics conditionally:

    - Message publishes a single message to its topic, and commits the offset

    - MultiMessage publishes all messages in its records field, and commits the offset

    - PassThroughMessage does not publish anything, but commits the offset

    Note that there is a risk that something fails after publishing but before committing, so this provides "at-least-once delivery" semantics.
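
    A consume-transform-produce loop built on this sink might look as follows. This is a sketch, not a definitive implementation: the bootstrap server, group id and topic names are placeholders, and it requires a running Kafka broker:

```scala
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.kafka.{ConsumerSettings, ProducerMessage, ProducerSettings, Subscriptions}
import akka.kafka.scaladsl.{Consumer, Producer}
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

implicit val system = ActorSystem("committable-sink-example")
implicit val mat = ActorMaterializer()

// Placeholder settings: adjust servers, group id and topics for your cluster.
val consumerSettings =
  ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
    .withBootstrapServers("localhost:9092")
    .withGroupId("group1")
val producerSettings =
  ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092")

// Consume, republish to another topic, and let the sink commit each
// offset after the record has been published.
val done =
  Consumer
    .committableSource(consumerSettings, Subscriptions.topics("source-topic"))
    .map { msg =>
      ProducerMessage.single(
        new ProducerRecord[String, String]("target-topic", msg.record.value),
        passThrough = msg.committableOffset)
    }
    .runWith(Producer.committableSink(producerSettings))
```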

  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. def flexiFlow[K, V, PassThrough](settings: ProducerSettings[K, V], producer: Producer[K, V]): Flow[Envelope[K, V, PassThrough], Results[K, V, PassThrough], NotUsed]

    Create a flow that conditionally publishes records to Kafka topics and then passes the results on downstream.

    It publishes records to Kafka topics conditionally:

    - Message publishes a single message to its topic, and continues in the stream as Result

    - MultiMessage publishes all messages in its records field, and continues in the stream as MultiResult

    - PassThroughMessage does not publish anything, and continues in the stream as PassThroughResult

    The messages allow arbitrary data to be passed through, for example a CommittableOffset or CommittableOffsetBatch that can be committed later in the flow.

    Supports sharing a Kafka Producer instance.

  12. def flexiFlow[K, V, PassThrough](settings: ProducerSettings[K, V]): Flow[Envelope[K, V, PassThrough], Results[K, V, PassThrough], NotUsed]

    Create a flow that conditionally publishes records to Kafka topics and then passes the results on downstream.

    It publishes records to Kafka topics conditionally:

    - Message publishes a single message to its topic, and continues in the stream as Result

    - MultiMessage publishes all messages in its records field, and continues in the stream as MultiResult

    - PassThroughMessage does not publish anything, and continues in the stream as PassThroughResult

    The messages allow arbitrary data to be passed through, for example a CommittableOffset or CommittableOffsetBatch that can be committed later in the flow.
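
    The pass-through mechanism is what lets offsets travel alongside the produce results so they can be committed downstream. A sketch under the assumption that consumerSettings, producerSettings and committerSettings are defined as in the example settings elsewhere in the docs, with placeholder topic names:

```scala
import akka.kafka.{ProducerMessage, Subscriptions}
import akka.kafka.scaladsl.{Committer, Consumer, Producer}
import org.apache.kafka.clients.producer.ProducerRecord

val done =
  Consumer
    .committableSource(consumerSettings, Subscriptions.topics("source-topic"))
    .map { msg =>
      // Carry the offset through the producer stage as pass-through data.
      ProducerMessage.single(
        new ProducerRecord[String, String]("target-topic", msg.record.value),
        passThrough = msg.committableOffset)
    }
    .via(Producer.flexiFlow(producerSettings))
    .map(_.passThrough) // recover the CommittableOffset from the Result
    .runWith(Committer.sink(committerSettings))
```

    Committing via a separate downstream stage, rather than inside the producer sink, allows offsets to be batched before committing.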

  13. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  14. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  15. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  16. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  17. final def notify(): Unit

    Definition Classes
    AnyRef
  18. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  19. def plainSink[K, V](settings: ProducerSettings[K, V], producer: Producer[K, V]): Sink[ProducerRecord[K, V], Future[Done]]

    Create a sink for publishing records to Kafka topics.

    The Kafka ProducerRecord contains the topic name to which the record is being sent, an optional partition number, and an optional key and value.

    Supports sharing a Kafka Producer instance.

  20. def plainSink[K, V](settings: ProducerSettings[K, V]): Sink[ProducerRecord[K, V], Future[Done]]

    Create a sink for publishing records to Kafka topics.

    The Kafka ProducerRecord contains the topic name to which the record is being sent, an optional partition number, and an optional key and value.
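
    A minimal sketch of the plain sink, with a placeholder bootstrap server address; it publishes stream elements as ProducerRecord values without any offset handling:

```scala
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.Source
import akka.kafka.ProducerSettings
import akka.kafka.scaladsl.Producer
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.StringSerializer

implicit val system = ActorSystem("plain-sink-example")
implicit val mat = ActorMaterializer()

// Placeholder settings: point the bootstrap servers at your cluster.
val producerSettings =
  ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092")

// Publish the numbers 1 to 100 as string values to "topic1".
// The materialized Future[Done] completes when the stream finishes.
val done = Source(1 to 100)
  .map(n => new ProducerRecord[String, String]("topic1", n.toString))
  .runWith(Producer.plainSink(producerSettings))
```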

  21. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  22. def toString(): String

    Definition Classes
    AnyRef → Any
  23. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  25. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def commitableSink[K, V](settings: ProducerSettings[K, V], producer: Producer[K, V]): Sink[Envelope[K, V, Committable], Future[Done]]

    Create a sink that is aware of the committable offset from a Consumer.committableSource. It commits the consumer offset once the message has been published successfully to the topic.

    It publishes records to Kafka topics conditionally:

    - Message publishes a single message to its topic, and commits the offset

    - MultiMessage publishes all messages in its records field, and commits the offset

    - PassThroughMessage does not publish anything, but commits the offset

    Note that there is always a risk that something fails after publishing but before committing, so this provides "at-least-once delivery" semantics.

    Supports sharing a Kafka Producer instance.

    Annotations
    @deprecated
    Deprecated

    (Since version 1.0-RC1) use committableSink instead

  2. def commitableSink[K, V](settings: ProducerSettings[K, V]): Sink[Envelope[K, V, Committable], Future[Done]]

    Create a sink that is aware of the committable offset from a Consumer.committableSource. It commits the consumer offset once the message has been published successfully to the topic.

    It publishes records to Kafka topics conditionally:

    - Message publishes a single message to its topic, and commits the offset

    - MultiMessage publishes all messages in its records field, and commits the offset

    - PassThroughMessage does not publish anything, but commits the offset

    Note that there is a risk that something fails after publishing but before committing, so this provides "at-least-once delivery" semantics.

    Annotations
    @deprecated
    Deprecated

    (Since version 1.0-RC1) use committableSink instead

  3. def flow[K, V, PassThrough](settings: ProducerSettings[K, V], producer: Producer[K, V]): Flow[Message[K, V, PassThrough], Result[K, V, PassThrough], NotUsed]

    Create a flow that publishes records to Kafka topics and then passes the results on downstream.

    The records must be wrapped in a Message and continue in the stream as Result.

    The messages allow arbitrary data to be passed through, for example a CommittableOffset or CommittableOffsetBatch that can be committed later in the flow.

    Supports sharing a Kafka Producer instance.

    Annotations
    @deprecated
    Deprecated

    (Since version 0.21) prefer flexiFlow over this flow implementation

  4. def flow[K, V, PassThrough](settings: ProducerSettings[K, V]): Flow[Message[K, V, PassThrough], Result[K, V, PassThrough], NotUsed]

    Create a flow that publishes records to Kafka topics and then passes the results on downstream.

    The records must be wrapped in a Message and continue in the stream as Result.

    The messages allow arbitrary data to be passed through, for example a CommittableOffset or CommittableOffsetBatch that can be committed later in the flow.

    Annotations
    @deprecated
    Deprecated

    (Since version 0.21) prefer flexiFlow over this flow implementation
