Class KafkaGSpec
- java.lang.Object
-
- com.privalia.qa.specs.BaseGSpec
-
- com.privalia.qa.specs.KafkaGSpec
-
public class KafkaGSpec extends BaseGSpec
Step definitions for working with Apache Kafka.
Author:
- José Fernández
-
-
Field Summary
-
Fields inherited from class com.privalia.qa.specs.BaseGSpec
commonspec
-
-
Constructor Summary
Constructor Description
KafkaGSpec(CommonG spec)
Instantiates a new KafkaGSpec.
-
Method Summary
All methods are instance methods, are concrete, and return void.

assertTopicContainsAvroMessageWithProperties(String topicName, String avroRecord, io.cucumber.datatable.DataTable dataTable)
Search the topic for the given Avro record.

assertTopicContainsMessage(String topic, String content, String key)
Check if a message exists in a topic.

assertTopicContainsMessageWithProperties(String topicName, String message, String isKey, io.cucumber.datatable.DataTable dataTable)
Read messages from a topic with the given consumer properties.

assertTopicContainsPartialAvroMessageWithProperties(String topicName, String atLeast, int expectedCount, io.cucumber.datatable.DataTable datatable)
Perform partial property matching on the Avro records returned.

assertTopicDoesntExist(String topic_name)
Check that a Kafka topic does not exist.

assertTopicExists(String topic_name)
Check that a Kafka topic exists.

checkNumberOfPartitions(String topic_name, int numOfPartitions)
Check that the number of partitions is the expected one for the given topic.

configureConsumerProperties(io.cucumber.datatable.DataTable dataTable)
Modify consumer properties.

configureProducerProperties(io.cucumber.datatable.DataTable dataTable)
Modify producer properties.

connectToKafka(String zkHost, String zkPath)
Connect to Kafka.

createNewAvroMessage(String recordName, String schemaFile, String seedFile, io.cucumber.datatable.DataTable table)
Create an Avro record from the specified schema.

createNewAvroMessageFromRegistry(String recordName, String versionNumber, String subject, String seedFile, io.cucumber.datatable.DataTable table)
Create a new Avro record by reading the schema directly from the schema registry for the specified subject and version.

createTopic(String topic_name, String ifExists)
Create a Kafka topic.

deleteTopic(String topic_name)
Delete a Kafka topic.

disconnectFromKafka()
Close the connection to Kafka.

modifyTopicPartitions(int numPartitions, String topic_name)
Increase the number of partitions in a Kafka topic.

registerNewSchema(String subjectName, String filepath)
Add a new schema to the schema registry.

sendAvroMessageToTopicWithProperties(String genericRecord, String topicName, String recordKey, io.cucumber.datatable.DataTable table)
Send the previously created Avro record.

sendMessageToTopic(String message, String topic_name, String recordKey, String ifExists)
Send a message to a Kafka topic.

sendMessageToTopicWithProperties(String message, String topic_name, String recordKey, io.cucumber.datatable.DataTable table)
Send a message to a Kafka topic with properties.

setSchemaRegistryURL(String host)
Set the URL of the schema registry.
-
Methods inherited from class com.privalia.qa.specs.BaseGSpec
getCommonSpec
-
-
-
-
Constructor Detail
-
KafkaGSpec
public KafkaGSpec(CommonG spec)
Instantiates a new KafkaGSpec.
- Parameters:
spec - the spec
-
-
Method Detail
-
connectToKafka
@Given("^I connect to kafka at \'(.+?)\'( using path \'(.+?)\')?$") public void connectToKafka(String zkHost, String zkPath) throws UnknownHostException
Connect to Kafka. Establishes the connection to the Kafka cluster via the IP of the Zookeeper service. This is an initialization step necessary for all subsequent steps.
Example: Assuming zookeeper service is running at localhost:2181
Given I connect to kafka at 'localhost:2181'
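Example: The step also accepts an optional Zookeeper path. A hedged sketch, assuming the brokers are registered under the Zookeeper chroot '/kafka' (the path value here is illustrative):
Given I connect to kafka at 'localhost:2181' using path '/kafka'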
- Parameters:
zkHost - ZK host
zkPath - ZK path
- Throws:
UnknownHostException - exception
- See Also:
disconnectFromKafka()
-
createTopic
@When("^I create a Kafka topic named \'(.+?)\'( if it doesn\'t exists)?") public void createTopic(String topic_name, String ifExists) throws Exception
Create a Kafka topic. Creates a Kafka topic with the given name; it can also create the topic only if it doesn't already exist. All topics are created by default with 1 partition and a replication factor of 1.
Example: Create the topic 'testqa'
Given I create a Kafka topic named 'testqa'
Example: Create the topic 'testqa' only if it doesn't already exist
Given I create a Kafka topic named 'testqa' if it doesn't exists
- Parameters:
topic_name - topic name
ifExists - String for matching optional text in Gherkin
- Throws:
Exception - Exception
- See Also:
deleteTopic(String), assertTopicExists(String), modifyTopicPartitions(int, String)
-
deleteTopic
@When("^I delete a Kafka topic named \'(.+?)\'") public void deleteTopic(String topic_name) throws Exception
Delete a Kafka topic.
Example: Delete the topic 'testqa'
When I delete a Kafka topic named 'testqa'
- Parameters:
topic_name - topic name
- Throws:
Exception - Exception
- See Also:
createTopic(String, String), assertTopicExists(String), modifyTopicPartitions(int, String)
-
modifyTopicPartitions
@When("^I increase \'(.+?)\' partitions in a Kafka topic named \'(.+?)\'") public void modifyTopicPartitions(int numPartitions, String topic_name) throws Exception
Increase partitions in a Kafka topic. Modifies the topic by increasing its current number of partitions by the specified amount. Mind that the number of partitions for a topic can only be increased once it is created.
Example: Increase by 1 the number of partitions of topic 'testqa'
Given I increase '1' partitions in a Kafka topic named 'testqa'
- Parameters:
numPartitions - number of partitions to add to the current amount of partitions for the topic
topic_name - topic name
- Throws:
Exception - Exception
- See Also:
deleteTopic(String), createTopic(String, String)
-
sendMessageToTopic
@When("^I send a message \'(.+?)\' to the kafka topic named \'(.+?)\'( with key \'(.+?)\')?( if not exists)?$") public void sendMessageToTopic(String message, String topic_name, String recordKey, String ifExists) throws Exception
Sends a message to a Kafka topic. By default, this step uses StringSerializer and StringDeserializer for the key/value of the message, and default properties for the producer. This step can also verify whether a message with the corresponding key and value already exists in the topic before inserting. Remember that for a consumer to be able to read all the messages from a topic, it must use a new group.id (not used before in that topic); when reading from a Kafka topic, Kafka returns the next unread message (by offset) for that group.
Example: For sending a simple message (only specifying value)
Given I send a message 'hello' to the kafka topic named 'testqa'
Example: For sending a message with key and value (specifying key and value)
Given I send a message 'hello' to the kafka topic named 'testqa' with key 'keyvalue'
Example: To insert a message only if the exact key-value combination does not already exist in the topic
Given I send a message 'hello' to the kafka topic named 'testqa' with key 'keyvalue' if not exists
- Parameters:
message - string to send to the topic
topic_name - topic name
recordKey - the record key
ifExists - for handling optional text in Gherkin
- Throws:
Exception - Exception
- See Also:
sendMessageToTopicWithProperties(String, String, String, DataTable), sendAvroMessageToTopicWithProperties(String, String, String, DataTable)
-
sendMessageToTopicWithProperties
@Given("I send a message \'(.+?)\' to the kafka topic named \'(.+?)\'( with key \'(.+?)\')? with:$") public void sendMessageToTopicWithProperties(String message, String topic_name, String recordKey, io.cucumber.datatable.DataTable table) throws InterruptedException, ExecutionException, TimeoutException
Sends a message to a Kafka topic with properties. Similar to the
sendMessageToTopic(String, String, String, String)
step, but gives the possibility to override the properties of the producer before sending a message. For example, the producer properties can be altered to change the default serializer for key/value before sending: set the key.serializer property to "org.apache.kafka.common.serialization.StringSerializer" and the value.serializer property to "org.apache.kafka.common.serialization.LongSerializer". The library will try to automatically cast the message to the type of the specified value.serializer property, so trying to send the message "hello" using LongSerializer for value.serializer will produce an error.
Example: For sending a message to topic 'longTopic' with key as String and value as Long
When I send a message '1234567890' to the kafka topic named 'longTopic' with:
| key.serializer | org.apache.kafka.common.serialization.StringSerializer |
| value.serializer | org.apache.kafka.common.serialization.LongSerializer |
Other common properties for a Kafka producer:
- bootstrap.servers (defaults to 0.0.0.0:9092)
- acks (defaults to "all")
- retries (defaults to 0)
- batch.size (defaults to 16384)
- linger.ms (defaults to 1)
- buffer.memory (defaults to 33554432)
- client.id (defaults to KafkaQAProducer)
- Parameters:
message - Message to send (will be converted to the proper type specified by the value.serializer property; String is the default)
topic_name - Name of the topic where to send the message
recordKey - Key of the Kafka record
table - Table containing alternative properties for the producer
- Throws:
InterruptedException - InterruptedException
ExecutionException - ExecutionException
TimeoutException - TimeoutException
- See Also:
sendMessageToTopic(String, String, String, String), sendAvroMessageToTopicWithProperties(String, String, String, DataTable)
-
assertTopicDoesntExist
@Then("^A kafka topic named \'(.+?)\' does not exist") public void assertTopicDoesntExist(String topic_name) throws org.apache.zookeeper.KeeperException, InterruptedException
Check that a Kafka topic does not exist.
Example:
Then A kafka topic named 'testqa' does not exist
- Parameters:
topic_name - name of topic
- Throws:
org.apache.zookeeper.KeeperException - KeeperException
InterruptedException - InterruptedException
- See Also:
createTopic(String, String), deleteTopic(String), modifyTopicPartitions(int, String), assertTopicExists(String)
-
checkNumberOfPartitions
@Then("^The number of partitions in topic \'(.+?)\' should be \'(.+?)\'\'?$") public void checkNumberOfPartitions(String topic_name, int numOfPartitions) throws Exception
Check that the number of partitions is the expected one for the given topic.
- Parameters:
topic_name - Name of the Kafka topic
numOfPartitions - Expected number of partitions
- Throws:
Exception - Exception
- See Also:
modifyTopicPartitions(int, String)
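Example: a hedged sketch, assuming the topic 'testqa' exists and was created with the default single partition:
Then The number of partitions in topic 'testqa' should be '1'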
-
assertTopicContainsMessage
@Then("^The kafka topic \'(.*?)\' has a message containing \'(.*?)\'( as key)?$") public void assertTopicContainsMessage(String topic, String content, String key) throws InterruptedException
Check if a message exists. Polls the given topic for messages and checks if any has the given value. By default, this method uses String Serializer/Deserializer to read the messages from the topic (as well as all the default properties for the consumer).
Unless specified, the method only looks for records that contain the given string in the value of the Kafka record, but it can also match against the record key.
Example: Check if the topic contains the message with the given value
Then The kafka topic 'testqa' has a message containing 'hello'
Example: Check if the topic contains the message with the given key
Then The kafka topic 'testqa' has a message containing 'hello' as key
- Parameters:
topic - Topic to poll
content - Value to look for (as String)
key - key of the record
- Throws:
InterruptedException - InterruptedException
- See Also:
assertTopicContainsMessageWithProperties(String, String, String, DataTable), assertTopicContainsAvroMessageWithProperties(String, String, DataTable), assertTopicContainsPartialAvroMessageWithProperties(String, String, int, DataTable)
-
assertTopicExists
@Then("^A kafka topic named \'(.+?)\' exists") public void assertTopicExists(String topic_name) throws org.apache.zookeeper.KeeperException, InterruptedException
Check that a Kafka topic exists.
Example: Verify the topic 'testqa' exists
Then A kafka topic named 'testqa' exists
- Parameters:
topic_name - name of topic
- Throws:
org.apache.zookeeper.KeeperException - KeeperException
InterruptedException - InterruptedException
- See Also:
createTopic(String, String), deleteTopic(String), assertTopicDoesntExist(String), modifyTopicPartitions(int, String)
-
setSchemaRegistryURL
@Given("^My schema registry is running at \'(.+)\'$") public void setSchemaRegistryURL(String host) throws Throwable
Sets the URL of the schema registry. Initializes the remote URL of the schema registry service for all future requests, and sets the property schema.registry.url in the consumer and producer properties.
Example: To set the schema registry at localhost:8081:
Given My schema registry is running at 'localhost:8081'
- Parameters:
host - Remote host and port (defaults to http://0.0.0.0:8081)
- Throws:
Throwable - Throwable
- See Also:
registerNewSchema(String, String), createNewAvroMessageFromRegistry(String, String, String, String, DataTable)
-
registerNewSchema
@Then("^I register a new version of a schema under the subject \'(.+)\' with \'(.+)\'$") public void registerNewSchema(String subjectName, String filepath) throws Throwable
Adds a new schema to the schema registry. Generates a POST to the schema registry to add a new schema for the given subject.
Example: Assuming the file located under schemas/recordSchema.avsc contains the following valid schema:
{
  "namespace": "com.mynamespace",
  "type": "record",
  "name": "Record",
  "fields": [
    { "name": "str1", "type": "string" },
    { "name": "str2", "type": "string" },
    { "name": "int1", "type": "int" }
  ]
}
Then, to register it at the schema registry at localhost:8081:
Given My schema registry is running at 'localhost:8081'
Then I register a new version of a schema under the subject 'record' with 'schemas/recordSchema.avsc'
- Parameters:
subjectName - Name of the subject where to register the new schema
filepath - Path of the file containing the schema
- Throws:
Throwable - Throwable
- See Also:
setSchemaRegistryURL(String), createNewAvroMessageFromRegistry(String, String, String, String, DataTable)
-
assertTopicContainsMessageWithProperties
@Then("^The kafka topic \'(.+?)\' has a message containing \'(.+?)\'( as key)? with:$") public void assertTopicContainsMessageWithProperties(String topicName, String message, String isKey, io.cucumber.datatable.DataTable dataTable) throws Throwable
Reads messages from a topic with properties. Reads messages from the beginning of the topic with the specified properties for the consumer. The message is cast to the correct type based on the given value.deserializer property (uses the String deserializer by default).
For example, the consumer properties can be altered to change the default deserializer for key/value when reading: set the key.deserializer property to "org.apache.kafka.common.serialization.StringDeserializer" and the value.deserializer property to "org.apache.kafka.common.serialization.LongDeserializer".
The library will try to automatically cast the message to the type of the specified value.deserializer property, so trying to read the message "hello" using LongDeserializer for value.deserializer will produce an error.
Example: Setting the read properties as String for key and Long for value before reading
Then The kafka topic 'longTopic' has a message containing '1234567890' with:
| key.deserializer | org.apache.kafka.common.serialization.StringDeserializer |
| value.deserializer | org.apache.kafka.common.serialization.LongDeserializer |
Other common properties for a Kafka consumer:
- bootstrap.servers (defaults to 0.0.0.0:9092)
- group.id (defaults to "test")
- enable.auto.commit (defaults to true)
- auto.offset.reset (defaults to 'earliest')
- auto.commit.intervals.ms (defaults to 1000)
- session.timeout (defaults to 10000)
- Parameters:
topicName - Name of the topic to read messages from
message - Message to look for (will be converted to the correct type according to the value.deserializer property)
isKey - if present, match against the record key instead of the value
dataTable - Table containing properties for the consumer
- Throws:
Throwable - Throwable
- See Also:
assertTopicContainsMessage(String, String, String), assertTopicContainsPartialAvroMessageWithProperties(String, String, int, DataTable), assertTopicContainsAvroMessageWithProperties(String, String, DataTable)
-
createNewAvroMessage
@Then("^I create the avro record \'(.+?)\' from the schema in \'(.+?)\'( based on \'(.+?)\')? with:$") public void createNewAvroMessage(String recordName, String schemaFile, String seedFile, io.cucumber.datatable.DataTable table) throws Throwable
Creates an Avro record from the specified schema. The record is created as a
GenericRecord
and the user can dynamically specify the properties values.
Example: To create a new GenericRecord with name 'record' using the schema under schemas/recordSchema.avsc
Given My schema registry is running at 'localhost:8081'
Then I register a new version of a schema under the subject 'record' with 'schemas/recordSchema.avsc'
Then I create the avro record 'record' from the schema in 'schemas/recordSchema.avsc' with:
| str1 | str1 |
| str2 | str2 |
| int1 | 1 |
The library will automatically cast the variables to the correct types specified in the *.avsc file, so trying to insert a string for "int1" will generate an error.
- Parameters:
recordName - Name of the Avro generic record
schemaFile - File containing the schema of the message
seedFile - the seed file
table - Table containing the values for the fields of the schema (values will be converted according to the field type)
- Throws:
Throwable - Throwable
- See Also:
sendAvroMessageToTopicWithProperties(String, String, String, DataTable), createNewAvroMessageFromRegistry(String, String, String, String, DataTable)
-
createNewAvroMessageFromRegistry
@Then("^I create the avro record \'(.+?)\' using version \'(.+?)\' of subject \'(.+?)\' from registry( based on \'(.+?)\')? with:$") public void createNewAvroMessageFromRegistry(String recordName, String versionNumber, String subject, String seedFile, io.cucumber.datatable.DataTable table) throws Throwable
Creates a new Avro record by reading the schema directly from the schema registry for the specified subject and version.
- Parameters:
recordName - Name of the record
versionNumber - Version number of the schema
subject - Subject name
seedFile - Seed file to use
table - Modifications datatable
- Throws:
Throwable - Throwable
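Example: a hedged sketch, assuming version '1' of the subject 'record' is already registered and its schema declares a string field str1:
Given My schema registry is running at 'localhost:8081'
Then I create the avro record 'record' using version '1' of subject 'record' from registry with:
| str1 | str1 |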
-
sendAvroMessageToTopicWithProperties
@When("^I send the avro record \'(.+?)\' to the kafka topic \'(.+?)\'( with key \'(.+?)\')? with:$") public void sendAvroMessageToTopicWithProperties(String genericRecord, String topicName, String recordKey, io.cucumber.datatable.DataTable table) throws Throwable
Send the previously created Avro record. The value.serializer property for the producer is automatically set to KafkaAvroSerializer.
Example:
When I send the avro record 'record' to the kafka topic 'avroTopic' with:
| key.serializer | org.apache.kafka.common.serialization.StringSerializer |
- Parameters:
genericRecord - Name of the record to send
topicName - Topic where to send the record
recordKey - Record key
table - Table containing modifications for the producer properties
- Throws:
Throwable - Throwable
- See Also:
createNewAvroMessage(String, String, String, DataTable), createNewAvroMessageFromRegistry(String, String, String, String, DataTable)
-
assertTopicContainsAvroMessageWithProperties
@Then("^The kafka topic \'(.+?)\' has an avro message \'(.+?)\' with:$") public void assertTopicContainsAvroMessageWithProperties(String topicName, String avroRecord, io.cucumber.datatable.DataTable dataTable) throws Throwable
Search the topic for the given Avro record. Reads the specified topic from the beginning looking for the specified Avro record. The consumer value.deserializer property is automatically set to KafkaAvroDeserializer.
Example:
Then The kafka topic 'avroTopic' has an avro message 'record' with:
| key.deserializer | org.apache.kafka.common.serialization.StringDeserializer |
- Parameters:
topicName - Topic to read from
avroRecord - Name of the record to read
dataTable - Table containing modifications for the consumer properties
- Throws:
Throwable - Throwable
- See Also:
createNewAvroMessage(String, String, String, DataTable), createNewAvroMessageFromRegistry(String, String, String, String, DataTable), sendAvroMessageToTopicWithProperties(String, String, String, DataTable)
-
configureConsumerProperties
@Then("^I configure the kafka consumers with:$") public void configureConsumerProperties(io.cucumber.datatable.DataTable dataTable)
Modify consumer properties. A single step for modifying the consumer properties for the rest of the scenario.
Example: To change consumer properties
Then I configure the kafka consumers with:
| group.id | ${ID} |
| key.deserializer | org.apache.kafka.common.serialization.StringDeserializer |
| value.deserializer | org.apache.kafka.common.serialization.StringDeserializer |
| bootstrap.servers | srdc1kafkassl5.privalia.pin:9092,srdc1kafkassl9.privalia.pin:9092,srdc1kafkassl10.privalia.pin:9092 |
- Parameters:
dataTable - table with consumer properties
- See Also:
configureProducerProperties(DataTable)
-
configureProducerProperties
@Then("^I configure the kafka producer with:$") public void configureProducerProperties(io.cucumber.datatable.DataTable dataTable)
Modify producer properties. A single step for modifying the producer properties for the rest of the scenario.
Example: To change producer properties settings:
Then I configure the kafka producer with:
| client.id | QAderInjector |
| key.serializer | org.apache.kafka.common.serialization.StringSerializer |
| value.serializer | org.apache.kafka.common.serialization.StringSerializer |
| bootstrap.servers | srdc1kafkassl5.privalia.pin:9092,srdc1kafkassl9.privalia.pin:9092,srdc1kafkassl10.privalia.pin:9092 |
- Parameters:
dataTable - table with producer properties
- See Also:
configureConsumerProperties(DataTable)
-
disconnectFromKafka
@Then("^I close the connection to kafka$") public void disconnectFromKafka() throws Throwable
Close the connection to Kafka.
Example:
Then I close the connection to kafka
- Throws:
Throwable - the Throwable
- See Also:
connectToKafka(String, String)
-
assertTopicContainsPartialAvroMessageWithProperties
@And("^The kafka topic \'(.+?)\' has( at least)? \'(.+?)\' an avro message with:$") public void assertTopicContainsPartialAvroMessageWithProperties(String topicName, String atLeast, int expectedCount, io.cucumber.datatable.DataTable datatable) throws Throwable
Performs partial property matching on the Avro records returned.
Example:
Then The kafka topic 'avroTopic' has at least '1' an avro message with:
| user.id | Paul |
- Parameters:
topicName - Name of the topic to read messages from
atLeast - Indicates to find at least the expectedCount; if omitted, asserts the exact quantity is found
expectedCount - Expected amount of records to find that match the given conditions
datatable - Expected conditions
- Throws:
Throwable - the Throwable
- See Also:
assertTopicContainsAvroMessageWithProperties(String, String, DataTable)
-
-