Create a sink that is aware of the committable offset from a Consumer.committableSource. It will commit the consumer offset when the message has been published successfully to the topic.
It publishes records to Kafka topics conditionally:
- Message publishes a single message to its topic, and commits the offset
- MultiMessage publishes all messages in its records field, and commits the offset
- PassThroughMessage does not publish anything, but commits the offset
Note that there is always a risk that something fails after publishing but before committing, so this provides "at-least-once delivery" semantics.
Supports sharing a Kafka Producer instance.
Create a sink that is aware of the committable offset from a Consumer.committableSource. It will commit the consumer offset when the message has been published successfully to the topic.
It publishes records to Kafka topics conditionally:
- Message publishes a single message to its topic, and commits the offset
- MultiMessage publishes all messages in its records field, and commits the offset
- PassThroughMessage does not publish anything, but commits the offset
Note that there is a risk that something fails after publishing but before committing, so this provides "at-least-once delivery" semantics.
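As a sketch, consuming from one topic and republishing to another while committing offsets through this sink could look like the following (bootstrap servers, topic names, and group id are placeholder values; assumes Akka 2.6+, where the implicit ActorSystem provides the materializer, and a running Kafka broker):

```scala
import akka.actor.ActorSystem
import akka.kafka.{ConsumerSettings, ProducerMessage, ProducerSettings, Subscriptions}
import akka.kafka.scaladsl.{Consumer, Producer}
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

implicit val system: ActorSystem = ActorSystem("committableSink-example")

val consumerSettings =
  ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
    .withBootstrapServers("localhost:9092") // placeholder
    .withGroupId("example-group")           // placeholder

val producerSettings =
  ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092") // placeholder

Consumer
  .committableSource(consumerSettings, Subscriptions.topics("source-topic"))
  .map { msg =>
    // The committable offset rides along as the pass-through; the sink
    // commits it once the record has been published successfully.
    ProducerMessage.single(
      new ProducerRecord[String, String]("target-topic", msg.record.key, msg.record.value),
      msg.committableOffset)
  }
  .runWith(Producer.committableSink(producerSettings))
```

A MultiMessage (via `ProducerMessage.multi`) or a PassThroughMessage (via `ProducerMessage.passThrough`) can be emitted from the same `map` stage to fan out or skip publishing for individual records.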
Create a flow to conditionally publish records to Kafka topics and then pass it on.
It publishes records to Kafka topics conditionally:
- Message publishes a single message to its topic, and continues in the stream as Result
- MultiMessage publishes all messages in its records field, and continues in the stream as MultiResult
- PassThroughMessage does not publish anything, and continues in the stream as PassThroughResult
The messages support the possibility to pass through arbitrary data, which can for example be a CommittableOffset or CommittableOffsetBatch that can be committed later in the flow.
Supports sharing a Kafka Producer instance.
Create a flow to conditionally publish records to Kafka topics and then pass it on.
It publishes records to Kafka topics conditionally:
- Message publishes a single message to its topic, and continues in the stream as Result
- MultiMessage publishes all messages in its records field, and continues in the stream as MultiResult
- PassThroughMessage does not publish anything, and continues in the stream as PassThroughResult
The messages support the possibility to pass through arbitrary data, which can for example be a CommittableOffset or CommittableOffsetBatch that can be committed later in the flow.
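A sketch of the conditional-publish pattern, carrying each CommittableOffset through the flow and committing it downstream with a Committer sink (bootstrap servers, topic names, and group id are placeholders; assumes Akka 2.6+ and a running Kafka broker):

```scala
import akka.actor.ActorSystem
import akka.kafka.ConsumerMessage.CommittableOffset
import akka.kafka.{CommitterSettings, ConsumerSettings, ProducerMessage, ProducerSettings, Subscriptions}
import akka.kafka.scaladsl.{Committer, Consumer, Producer}
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

implicit val system: ActorSystem = ActorSystem("flexiFlow-example")

val consumerSettings =
  ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
    .withBootstrapServers("localhost:9092") // placeholder
    .withGroupId("example-group")           // placeholder

val producerSettings =
  ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092") // placeholder

Consumer
  .committableSource(consumerSettings, Subscriptions.topics("source-topic"))
  .map { msg =>
    if (msg.record.value.nonEmpty)
      // Message: publish one record; the offset travels as the pass-through
      ProducerMessage.single(
        new ProducerRecord[String, String]("target-topic", msg.record.key, msg.record.value),
        msg.committableOffset)
    else
      // PassThroughMessage: publish nothing, but keep the offset moving downstream
      ProducerMessage.passThrough[String, String, CommittableOffset](msg.committableOffset)
  }
  .via(Producer.flexiFlow(producerSettings))
  .map(_.passThrough) // every Result and PassThroughResult still carries its offset
  .runWith(Committer.sink(CommitterSettings(system)))
```

Committing via `Committer.sink` after the flow (rather than per message) lets offsets be batched, which is usually the reason to prefer the flow over the committable sink.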
Create a sink for publishing records to Kafka topics.
The Kafka ProducerRecord contains the topic name to which the record is being sent, an optional partition number, and an optional key and value.
Supports sharing a Kafka Producer instance.
Create a sink for publishing records to Kafka topics.
The Kafka ProducerRecord contains the topic name to which the record is being sent, an optional partition number, and an optional key and value.
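A minimal sketch of this sink, publishing plain records built with only a topic name and a value (bootstrap server and topic name are placeholders; assumes Akka 2.6+ and a running Kafka broker):

```scala
import akka.actor.ActorSystem
import akka.kafka.ProducerSettings
import akka.kafka.scaladsl.Producer
import akka.stream.scaladsl.Source
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.StringSerializer

implicit val system: ActorSystem = ActorSystem("plainSink-example")

val producerSettings =
  ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092") // placeholder

Source(1 to 10)
  // ProducerRecord(topic, value): partition and key are left unset here
  .map(n => new ProducerRecord[String, String]("target-topic", n.toString))
  .runWith(Producer.plainSink(producerSettings))
```

Other `ProducerRecord` constructors accept an explicit key and partition number when routing matters.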
Create a sink that is aware of the committable offset from a Consumer.committableSource. It will commit the consumer offset when the message has been published successfully to the topic.
It publishes records to Kafka topics conditionally:
- Message publishes a single message to its topic, and commits the offset
- MultiMessage publishes all messages in its records field, and commits the offset
- PassThroughMessage does not publish anything, but commits the offset
Note that there is always a risk that something fails after publishing but before committing, so this provides "at-least-once delivery" semantics.
Supports sharing a Kafka Producer instance.
(Since version 1.0-RC1) use committableSink instead
Create a sink that is aware of the committable offset from a Consumer.committableSource. It will commit the consumer offset when the message has been published successfully to the topic.
It publishes records to Kafka topics conditionally:
- Message publishes a single message to its topic, and commits the offset
- MultiMessage publishes all messages in its records field, and commits the offset
- PassThroughMessage does not publish anything, but commits the offset
Note that there is a risk that something fails after publishing but before committing, so this provides "at-least-once delivery" semantics.
(Since version 1.0-RC1) use committableSink instead
Create a flow to publish records to Kafka topics and then pass it on.
The records must be wrapped in a Message and continue in the stream as Result.
The messages support the possibility to pass through arbitrary data, which can for example be a CommittableOffset or CommittableOffsetBatch that can be committed later in the flow.
Supports sharing a Kafka Producer instance.
(Since version 0.21) prefer flexiFlow over this flow implementation
Create a flow to publish records to Kafka topics and then pass it on.
The records must be wrapped in a Message and continue in the stream as Result.
The messages support the possibility to pass through arbitrary data, which can for example be a CommittableOffset or CommittableOffsetBatch that can be committed later in the flow.
(Since version 0.21) prefer flexiFlow over this flow implementation
Akka Stream connector for publishing messages to Kafka topics.