pl.touk.nussknacker.engine.avro.source
- provides a set of strategies for serialization and deserialization during event processing and/or testing.
Basic implementation of new source creation. Override this method to create a custom KafkaSource.
Base implementation of a KafkaSource factory with Avro schema support. It is based on GenericNodeTransformation to:
- allow key and value type identification based on the Schema Registry, and
- allow Context initialization with the event's value, key and metadata.

You can provide schemas for both key and value. When useStringForKey = true (see KafkaConfig), the contents of the event's key are treated as a String (this is the default scenario). The reader schema used at runtime is determined by topic and version. The reader schema can differ from the schema used by the writer (e.g. when the writer produces events with a newer schema); in that case "schema evolution" may be required. For SpecificRecord use SpecificRecordKafkaAvroSourceFactory.

Assumptions:
1. Every incoming event has its key and value schemas registered in the Schema Registry.
2. The Avro payload must include the schema id for both Generic and Specific records (to support "schema evolution" we need to know the exact writer's schema).
- type of the event's key, used to determine whether the key object is Specific or Generic (for GenericRecords use Any)
- type of the event's value, used to determine whether the value object is Specific or Generic (for GenericRecords use Any)
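The "schema evolution" mentioned above is performed by Avro's schema resolution: a payload written with the writer's schema is decoded against a (possibly newer) reader schema. The sketch below illustrates this mechanism with the plain Apache Avro Java API; it is not Nussknacker's actual implementation, and the Event schema and field names are made up for the example. It assumes org.apache.avro is on the classpath.

```scala
import java.io.ByteArrayOutputStream
import org.apache.avro.Schema
import org.apache.avro.generic.{GenericData, GenericDatumReader, GenericDatumWriter, GenericRecord}
import org.apache.avro.io.{DecoderFactory, EncoderFactory}

object SchemaEvolutionSketch extends App {
  // Writer schema: the schema the producer actually used.
  val writerSchema = new Schema.Parser().parse(
    """{"type":"record","name":"Event","fields":[{"name":"id","type":"string"}]}""")

  // Reader schema: what this source expects; it adds a field with a default,
  // which is a backward-compatible change under Avro's resolution rules.
  val readerSchema = new Schema.Parser().parse(
    """{"type":"record","name":"Event","fields":[
      |  {"name":"id","type":"string"},
      |  {"name":"source","type":"string","default":"unknown"}
      |]}""".stripMargin)

  // Serialize with the writer schema (in Kafka, the payload would also
  // carry the schema id so the consumer can look up this exact schema).
  val record = new GenericData.Record(writerSchema)
  record.put("id", "42")
  val out = new ByteArrayOutputStream()
  val encoder = EncoderFactory.get().binaryEncoder(out, null)
  new GenericDatumWriter[GenericRecord](writerSchema).write(record, encoder)
  encoder.flush()

  // Deserialize with BOTH schemas: Avro resolves the writer's schema against
  // the reader's, filling the new "source" field from its default value.
  val decoder = DecoderFactory.get().binaryDecoder(out.toByteArray, null)
  val evolved = new GenericDatumReader[GenericRecord](writerSchema, readerSchema).read(null, decoder)
  println(evolved) // decoded record also contains the defaulted "source" field
}
```

This is why assumption 2 above matters: without the schema id in the payload, the consumer cannot recover the exact writer's schema needed as the first argument to the resolving reader.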