Package com.privalia.qa.specs
Class KafkaGSpec
java.lang.Object
  com.privalia.qa.specs.BaseGSpec
    com.privalia.qa.specs.KafkaGSpec

public class KafkaGSpec extends BaseGSpec

Step definitions for working with Apache Kafka.

Author:
José Fernández
Field Summary
-
Fields inherited from class com.privalia.qa.specs.BaseGSpec
commonspec
-
-
Constructor Summary
Constructors

KafkaGSpec(CommonG spec)
Instantiates a new KafkaGSpec.
-
Method Summary
All Methods  Instance Methods  Concrete Methods

void checkMessages(String topic, String content, String key)
Polls the given topic for messages and checks if any have the given value.

void checkNumberOfPartitions(String topic_name, int numOfPartitions)
Checks that the number of partitions is the expected one for the given topic.

void connectKafka(String zkHost, String zkPath)
Connects to Kafka.

void createKafkaTopic(String topic_name, String ifExists)
Creates a Kafka topic.

void deleteKafkaTopic(String topic_name)
Deletes a Kafka topic.

void iCloseTheConnectionToKafka()
Closes the connection to Kafka.

void iConfigureConsumerProperties(io.cucumber.datatable.DataTable dataTable)
A single step for modifying the consumer properties for the rest of the scenario.

void iConfigureProducerProperties(io.cucumber.datatable.DataTable dataTable)
A single step for modifying the producer properties for the rest of the scenario.

void iCreateTheAvroRecordRecord(String recordName, String schemaFile, String seedFile, io.cucumber.datatable.DataTable table)
Creates an Avro record from the specified schema.

void iCreateTheAvroRecordRecordUsingVersionOfSubjectRecordFromRegistryWith(String recordName, String versionNumber, String subject, String seedFile, io.cucumber.datatable.DataTable table)
Creates a new Avro record by reading the schema directly from the schema registry for the specified subject and version.

void iRegisterANewVersionOfASchemaUnderTheSubject(String subjectName, String filepath)
Generates a POST to the schema registry to add a new schema for the given subject.

void iSendTheAvroRecordRecordToTheKafkaTopic(String genericRecord, String topicName, String recordKey, io.cucumber.datatable.DataTable table)
Sends the previously created Avro record to the given topic.

void kafkaTopicExist(String topic_name)
Checks that a Kafka topic exists.

void kafkaTopicNotExist(String topic_name)
Checks that a Kafka topic does not exist.

void modifyPartitions(int numPartitions, String topic_name)
Modifies partitions in a Kafka topic by increasing the current number of partitions in the topic by the specified number.

void mySchemaRegistryIsRunningAtLocalhost(String host)
Initializes the remote URL of the schema registry service for all future requests.

void sendAMessage(String message, String topic_name, String recordKey, String ifExists)
Sends a message to a Kafka topic.

void sendAMessageWithDatatable(String message, String topic_name, String recordKey, io.cucumber.datatable.DataTable table)
Sends a message to a Kafka topic, allowing producer properties to be modified.

void theKafkaTopicAvroTopicHasAnAvroMessageRecordWith(String topicName, String avroRecord, io.cucumber.datatable.DataTable dataTable)
Reads the specified topic from the beginning for the specified Avro record.

void theKafkaTopicHasAnAvroMessageWith(String topicName, String atLeast, int expectedCount, io.cucumber.datatable.DataTable datatable)
Performs a partial property matching on the Avro records returned.

void theKafkaTopicStringTopicHasAMessageContainingHelloWith(String topicName, String message, String isKey, io.cucumber.datatable.DataTable dataTable)
Reads messages from the beginning of the topic with the specified properties for the consumer.
Methods inherited from class com.privalia.qa.specs.BaseGSpec
getCommonSpec
-
-
-
-
Constructor Detail
-
KafkaGSpec
public KafkaGSpec(CommonG spec)
Instantiates a new KafkaGSpec.
Parameters:
spec - the spec
-
-
Method Detail
-
connectKafka
@Given("^I connect to kafka at \'(.+?)\'( using path \'(.+?)\')?$") public void connectKafka(String zkHost, String zkPath) throws UnknownHostException
Connect to Kafka.
Parameters:
zkHost - ZK host
zkPath - ZK path
Throws:
UnknownHostException - exception
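As a usage sketch, this step could appear in a feature file as follows (the Zookeeper host, port, and path are illustrative):

```gherkin
Given I connect to kafka at 'localhost:2181'
Given I connect to kafka at 'localhost:2181' using path '/kafka'
```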
-
createKafkaTopic
@When("^I create a Kafka topic named \'(.+?)\'( if it doesn\'t exists)?") public void createKafkaTopic(String topic_name, String ifExists) throws Exception
Create a Kafka topic.
Parameters:
topic_name - topic name
ifExists - String for matching optional text in Gherkin
Throws:
Exception - Exception
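Matching the step pattern above, a scenario might contain (the topic name is illustrative; the optional suffix is spelled exactly as in the regex):

```gherkin
When I create a Kafka topic named 'testTopic'
When I create a Kafka topic named 'testTopic' if it doesn't exists
```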
-
deleteKafkaTopic
@When("^I delete a Kafka topic named \'(.+?)\'") public void deleteKafkaTopic(String topic_name) throws Exception
Delete a Kafka topic.
Parameters:
topic_name - topic name
Throws:
Exception - Exception
-
modifyPartitions
@When("^I increase \'(.+?)\' partitions in a Kafka topic named \'(.+?)\'") public void modifyPartitions(int numPartitions, String topic_name) throws Exception
Modify partitions in a Kafka topic by increasing the current number of partitions in the topic by the specified number. Note that the number of partitions for a topic can only be increased once it is created.
Parameters:
numPartitions - number of partitions to add to the current amount of partitions for the topic
topic_name - topic name
Throws:
Exception - Exception
-
sendAMessage
@When("^I send a message \'(.+?)\' to the kafka topic named \'(.+?)\'( with key \'(.+?)\')?( if not exists)?$") public void sendAMessage(String message, String topic_name, String recordKey, String ifExists) throws Exception
Sends a message to a Kafka topic. By default, this step uses StringSerializer and StringDeserializer for the key/value of the message, and default properties for the producer. This step can also verify whether a message with the corresponding key and value already exists in the topic before inserting. Remember that for a consumer to be able to read all the messages from a topic, it must use a new group.id (one not previously used in that topic).
Parameters:
message - string to send to the topic
topic_name - topic name
recordKey - the record key
ifExists - for handling optional text in Gherkin
Throws:
Exception - Exception
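For example, matching the step pattern above (message, topic, and key are illustrative):

```gherkin
When I send a message 'hello' to the kafka topic named 'testTopic'
When I send a message 'hello' to the kafka topic named 'testTopic' with key 'id-1' if not exists
```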
-
sendAMessageWithDatatable
@Given("I send a message \'(.+?)\' to the kafka topic named \'(.+?)\'( with key \'(.+?)\')? with:$") public void sendAMessageWithDatatable(String message, String topic_name, String recordKey, io.cucumber.datatable.DataTable table) throws InterruptedException, ExecutionException, TimeoutException
Sends a message to a Kafka topic. This step allows modifying any property of the producer before sending.
Parameters:
message - Message to send (will be converted to the proper type specified by the value.serializer property; String is the default)
topic_name - Name of the topic where to send the message
recordKey - Key of the Kafka record
table - Table containing alternative properties for the producer
Throws:
InterruptedException - InterruptedException
ExecutionException - ExecutionException
TimeoutException - TimeoutException
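A possible invocation, assuming the datatable uses two columns (property name, property value), which is the usual convention for these steps but is not stated here; the message and topic are illustrative:

```gherkin
Given I send a message '1234' to the kafka topic named 'longTopic' with:
  | value.serializer | org.apache.kafka.common.serialization.LongSerializer |
```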
-
kafkaTopicNotExist
@Then("^A kafka topic named \'(.+?)\' does not exist") public void kafkaTopicNotExist(String topic_name) throws org.apache.zookeeper.KeeperException, InterruptedException
Check that a Kafka topic does not exist.
Parameters:
topic_name - name of topic
Throws:
org.apache.zookeeper.KeeperException - KeeperException
InterruptedException - InterruptedException
-
checkNumberOfPartitions
@Then("^The number of partitions in topic \'(.+?)\' should be \'(.+?)\'\'?$") public void checkNumberOfPartitions(String topic_name, int numOfPartitions) throws Exception
Check that the number of partitions is the expected one for the given topic.
Parameters:
topic_name - Name of the Kafka topic
numOfPartitions - Number of partitions
Throws:
Exception - Exception
-
checkMessages
@Then("^The kafka topic \'(.*?)\' has a message containing \'(.*?)\'( as key)?$") public void checkMessages(String topic, String content, String key) throws InterruptedException
Polls the given topic for messages and checks if any have the given value. By default, this method uses the String serializer/deserializer to read the messages from the topic (as well as all the default properties for the consumer).
Parameters:
topic - Topic to poll
content - Value to look for (as String)
key - key of the record
Throws:
InterruptedException - InterruptedException
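Matching the step pattern above, the check can target either the value or the key (topic and values are illustrative):

```gherkin
Then The kafka topic 'testTopic' has a message containing 'hello'
Then The kafka topic 'testTopic' has a message containing 'id-1' as key
```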
-
kafkaTopicExist
@Then("^A kafka topic named \'(.+?)\' exists") public void kafkaTopicExist(String topic_name) throws org.apache.zookeeper.KeeperException, InterruptedException
Check that a Kafka topic exists.
Parameters:
topic_name - name of topic
Throws:
org.apache.zookeeper.KeeperException - KeeperException
InterruptedException - InterruptedException
-
mySchemaRegistryIsRunningAtLocalhost
@Given("^My schema registry is running at \'(.+)\'$") public void mySchemaRegistryIsRunningAtLocalhost(String host) throws Throwable
Initializes the remote URL of the schema registry service for all future requests. Also sets the property schema.registry.url in the consumer and producer properties.
Parameters:
host - Remote host and port (defaults to http://0.0.0.0:8081)
Throws:
Throwable - Throwable
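For example (the host and port are illustrative):

```gherkin
Given My schema registry is running at 'localhost:8081'
```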
-
iRegisterANewVersionOfASchemaUnderTheSubject
@Then("^I register a new version of a schema under the subject \'(.+)\' with \'(.+)\'$") public void iRegisterANewVersionOfASchemaUnderTheSubject(String subjectName, String filepath) throws Throwable
Generates a POST to the schema registry to add a new schema for the given subject.
Parameters:
subjectName - Name of the subject under which to register the new schema
filepath - Path of the file containing the schema
Throws:
Throwable - Throwable
-
theKafkaTopicStringTopicHasAMessageContainingHelloWith
@Then("^The kafka topic \'(.+?)\' has a message containing \'(.+?)\'( as key)? with:$") public void theKafkaTopicStringTopicHasAMessageContainingHelloWith(String topicName, String message, String isKey, io.cucumber.datatable.DataTable dataTable) throws Throwable
Reads messages from the beginning of the topic with the specified properties for the consumer. The message is cast to the correct type based on the given value.deserializer property (uses the String deserializer by default).
Parameters:
topicName - Name of the topic to read messages from
message - Message to look for (will be converted to the correct type according to the value.deserializer property)
isKey - whether the given text should be matched against the record key instead of the value
dataTable - Table containing properties for the consumer
Throws:
Throwable - Throwable
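A possible invocation, assuming the usual two-column (property, value) datatable layout; the topic and value are illustrative:

```gherkin
Then The kafka topic 'longTopic' has a message containing '1234' with:
  | value.deserializer | org.apache.kafka.common.serialization.LongDeserializer |
```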
-
iCreateTheAvroRecordRecord
@Then("^I create the avro record \'(.+?)\' from the schema in \'(.+?)\'( based on \'(.+?)\')? with:$") public void iCreateTheAvroRecordRecord(String recordName, String schemaFile, String seedFile, io.cucumber.datatable.DataTable table) throws Throwable
Creates an Avro record from the specified schema. The record is created as a GenericRecord.
Parameters:
recordName - Name of the Avro generic record
schemaFile - File containing the schema of the message
seedFile - the seed file
table - Table containing the values for the fields of the schema (values will be converted according to field type)
Throws:
Throwable - Throwable
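A usage sketch, assuming a two-column (field name, field value) datatable; the record name, schema path, and fields are illustrative:

```gherkin
Then I create the avro record 'record' from the schema in 'schemas/user.avsc' with:
  | name | John |
  | age  | 45   |
```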
-
iCreateTheAvroRecordRecordUsingVersionOfSubjectRecordFromRegistryWith
@Then("^I create the avro record \'(.+?)\' using version \'(.+?)\' of subject \'(.+?)\' from registry( based on \'(.+?)\')? with:$") public void iCreateTheAvroRecordRecordUsingVersionOfSubjectRecordFromRegistryWith(String recordName, String versionNumber, String subject, String seedFile, io.cucumber.datatable.DataTable table) throws Throwable
Creates a new Avro record by reading the schema directly from the schema registry for the specified subject and version.
Parameters:
recordName - Name of the record
versionNumber - Version number of the schema
subject - Subject name
seedFile - Seed file to use
table - Modifications datatable
Throws:
Throwable - Throwable
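A usage sketch, assuming a two-column (field name, field value) datatable; the record name, version, subject, and field are illustrative:

```gherkin
Then I create the avro record 'record' using version '1' of subject 'users-value' from registry with:
  | name | John |
```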
-
iSendTheAvroRecordRecordToTheKafkaTopic
@When("^I send the avro record \'(.+?)\' to the kafka topic \'(.+?)\'( with key \'(.+?)\')? with:$") public void iSendTheAvroRecordRecordToTheKafkaTopic(String genericRecord, String topicName, String recordKey, io.cucumber.datatable.DataTable table) throws Throwable
Send the previously created Avro record to the given topic. The value.serializer property for the producer is automatically set to KafkaAvroSerializer.
Parameters:
genericRecord - Name of the record to send
topicName - Topic where to send the record
recordKey - Record key
table - Table containing modifications for the producer properties
Throws:
Throwable - Throwable
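A possible invocation, assuming the usual two-column (property, value) datatable layout; the record name, topic, and property value are illustrative:

```gherkin
When I send the avro record 'record' to the kafka topic 'avroTopic' with:
  | schema.registry.url | http://localhost:8081 |
```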
-
theKafkaTopicAvroTopicHasAnAvroMessageRecordWith
@Then("^The kafka topic \'(.+?)\' has an avro message \'(.+?)\' with:$") public void theKafkaTopicAvroTopicHasAnAvroMessageRecordWith(String topicName, String avroRecord, io.cucumber.datatable.DataTable dataTable) throws Throwable
Reads the specified topic from the beginning for the specified Avro record. The consumer value.deserializer property is automatically set to KafkaAvroDeserializer.
Parameters:
topicName - Topic to read from
avroRecord - Name of the record to read
dataTable - Table containing modifications for the consumer properties
Throws:
Throwable - Throwable
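A possible invocation, assuming the usual two-column (property, value) datatable layout; the topic, record name, and property value are illustrative:

```gherkin
Then The kafka topic 'avroTopic' has an avro message 'record' with:
  | schema.registry.url | http://localhost:8081 |
```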
-
iConfigureConsumerProperties
@Then("^I configure the kafka consumers with:$") public void iConfigureConsumerProperties(io.cucumber.datatable.DataTable dataTable)
A single step for modifying the consumer properties for the rest of the scenario.
Parameters:
dataTable - table with consumer properties
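For example, assuming a two-column (property, value) datatable; the property values are illustrative:

```gherkin
Then I configure the kafka consumers with:
  | group.id          | new-group-1 |
  | auto.offset.reset | earliest    |
```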
-
iConfigureProducerProperties
@Then("^I configure the kafka producer with:$") public void iConfigureProducerProperties(io.cucumber.datatable.DataTable dataTable)
A single step for modifying the producer properties for the rest of the scenario.
Parameters:
dataTable - table with producer properties
-
iCloseTheConnectionToKafka
@Then("^I close the connection to kafka$") public void iCloseTheConnectionToKafka() throws Throwable
Close the connection to Kafka.
Throws:
Throwable - the throwable
-
theKafkaTopicHasAnAvroMessageWith
@And("^The kafka topic \'(.+?)\' has( at least)? \'(.+?)\' an avro message with:$") public void theKafkaTopicHasAnAvroMessageWith(String topicName, String atLeast, int expectedCount, io.cucumber.datatable.DataTable datatable) throws Throwable
Performs a partial property matching on the Avro records returned.
Parameters:
topicName - Name of the topic to read messages from
atLeast - Indicates to find at least the expectedCount; if omitted, asserts that the exact quantity is found
expectedCount - Expected number of records that match the given conditions
datatable - Expected conditions
Throws:
Throwable - the throwable
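A usage sketch, assuming a two-column (field path, expected value) datatable; the topic, count, and condition are illustrative:

```gherkin
And The kafka topic 'avroTopic' has at least '1' an avro message with:
  | name | John |
```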