Class KafkaGSpec


  • public class KafkaGSpec
    extends BaseGSpec
    Step definitions for working with Apache Kafka
    Author:
    José Fernández
    • Constructor Detail

      • KafkaGSpec

        public KafkaGSpec​(CommonG spec)
        Instantiates a new KafkaGSpec.
        Parameters:
        spec - the CommonG spec
    • Method Detail

      • connectToKafka

        @Given("^I connect to kafka at \'(.+?)\'( using path \'(.+?)\')?$")
        public void connectToKafka​(String zkHost,
                                   String zkPath)
                            throws UnknownHostException
        Connect to Kafka.

        Establishes the connection to the Kafka cluster via the IP of the Zookeeper service. This is an initialization step necessary for all subsequent steps

         Example: Assuming the Zookeeper service is running at localhost:2181
         
              Given I connect to kafka at 'localhost:2181'
         
         
        Parameters:
        zkHost - ZK host
        zkPath - Zookeeper path (optional)
        Throws:
        UnknownHostException - exception
        See Also:
        disconnectFromKafka()
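Because the ( using path '…')? fragment of the step pattern above is optional, zkPath arrives as null when the step is written without it. As an illustrative sketch (the class and method names here are hypothetical, not part of the library), the capture behavior of that Cucumber regex can be checked with java.util.regex:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ConnectStepRegexDemo {
    // Same expression as the @Given annotation above, unescaped
    static final Pattern STEP =
            Pattern.compile("^I connect to kafka at '(.+?)'( using path '(.+?)')?$");

    /** Returns {zkHost, zkPath} captured from a step line; zkPath is null when absent. */
    public static String[] capture(String stepText) {
        Matcher m = STEP.matcher(stepText);
        if (!m.matches()) {
            throw new IllegalArgumentException("Step text does not match: " + stepText);
        }
        // group(1) is the host, group(3) is the inner optional path capture
        return new String[]{m.group(1), m.group(3)};
    }
}
```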
      • createTopic

        @When("^I create a Kafka topic named \'(.+?)\'( if it doesn\'t exists)?")
        public void createTopic​(String topic_name,
                                String ifExists)
                         throws Exception
        Create a Kafka topic.

        Creates a Kafka topic with the given name. It can also create the topic only if it does not already exist. All topics are created by default with 1 partition and a replication factor of 1

         Example: Create the topic 'testqa'
         
              Given I create a Kafka topic named 'testqa'
         
         Example: Create the topic 'testqa' only if it does not already exist
         
              Given I create a Kafka topic named 'testqa' if it doesn't exists
         
         
        Parameters:
        topic_name - topic name
        ifExists - String for matching optional text in Gherkin
        Throws:
        Exception - Exception
        See Also:
        deleteTopic(String), assertTopicExists(String), modifyTopicPartitions(int, String)
      • modifyTopicPartitions

        @When("^I increase \'(.+?)\' partitions in a Kafka topic named \'(.+?)\'")
        public void modifyTopicPartitions​(int numPartitions,
                                          String topic_name)
                                   throws Exception
        Increase partitions in a Kafka topic.

        Modifies the partitions of a Kafka topic by increasing the current number of partitions by the specified amount. Bear in mind that the number of partitions for a topic can only be increased once the topic has been created.

         Example: Increase the partitions of topic 'testqa' by 1
         
              Given I increase '1' partitions in a Kafka topic named 'testqa'
         
         
        Parameters:
        numPartitions - number of partitions to add to the current amount of partitions for the topic
        topic_name - topic name
        Throws:
        Exception - Exception
        See Also:
        deleteTopic(String), createTopic(String, String), modifyTopicPartitions(int, String)
      • sendMessageToTopic

        @When("^I send a message \'(.+?)\' to the kafka topic named \'(.+?)\'( with key \'(.+?)\')?( if not exists)?$")
        public void sendMessageToTopic​(String message,
                                       String topic_name,
                                       String recordKey,
                                       String ifExists)
                                throws Exception
        Sends a message to a Kafka topic.

        By default, this step uses StringSerializer and StringDeserializer for the key/value of the message, and default properties for the producer. This step can also verify whether a message with the corresponding key and value already exists in the topic before inserting. Remember that for a consumer to be able to read all the messages from a topic, it must use a new group.id (one not used before in that topic): when reading from a Kafka topic, Kafka returns only the messages after the last offset read by that group.

         Example: For sending a simple message (only specifying value)
         
              Given I send a message 'hello' to the kafka topic named 'testqa'
         
         Example: For sending a message specifying both key and value
         
              Given I send a message 'hello' to the kafka topic named 'testqa' with key 'keyvalue'
         
         Example: To insert a message only if the exact key-value combination does not already exist in the topic
         
              Given I send a message 'hello' to the kafka topic named 'testqa' with key 'keyvalue' if not exists
         
         
        Parameters:
        message - string that you send to topic
        topic_name - topic name
        recordKey - the record key
        ifExists - for handling optional text in Gherkin
        Throws:
        Exception - Exception
        See Also:
        sendMessageToTopicWithProperties(String, String, String, DataTable), sendAvroMessageToTopicWithProperties(String, String, String, DataTable)
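Both trailing fragments of the step pattern above are optional, so recordKey and ifExists arrive as null when the shorter forms of the step are used. An illustrative check of the capture groups with java.util.regex (class and method names are hypothetical, not part of the library):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SendStepRegexDemo {
    // Same expression as the @When annotation above, unescaped
    static final Pattern STEP = Pattern.compile(
            "^I send a message '(.+?)' to the kafka topic named '(.+?)'"
            + "( with key '(.+?)')?( if not exists)?$");

    /** Returns {message, topic, recordKey, ifExists}; optional captures are null when omitted. */
    public static String[] capture(String stepText) {
        Matcher m = STEP.matcher(stepText);
        if (!m.matches()) {
            throw new IllegalArgumentException("Step text does not match: " + stepText);
        }
        // group(4) is the inner key capture, group(5) the " if not exists" flag
        return new String[]{m.group(1), m.group(2), m.group(4), m.group(5)};
    }
}
```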
      • sendMessageToTopicWithProperties

        @Given("I send a message \'(.+?)\' to the kafka topic named \'(.+?)\'( with key \'(.+?)\')? with:$")
        public void sendMessageToTopicWithProperties​(String message,
                                                     String topic_name,
                                                     String recordKey,
                                                     io.cucumber.datatable.DataTable table)
                                              throws InterruptedException,
                                                     ExecutionException,
                                                     TimeoutException
        Sends a message to a Kafka topic with properties.

        Similar to the sendMessageToTopic(String, String, String, String) step, but allows overriding the properties of the producer before sending a message. For example, the producer properties can be altered to change the default serializer for key/value before sending. In that case, the key.serializer property should be set to "org.apache.kafka.common.serialization.StringSerializer" and the value.serializer property to "org.apache.kafka.common.serialization.LongSerializer".

        The library will try to automatically cast the message to the type of the specified value.serializer property. So, for example, trying to send the message "hello" using LongSerializer for value.serializer will produce an error.

         Example: For sending a message to topic 'longTopic' with key as String and value as Long
         
              When I send a message '1234567890' to the kafka topic named 'longTopic' with:
                  | key.serializer    | org.apache.kafka.common.serialization.StringSerializer |
                  | value.serializer  | org.apache.kafka.common.serialization.LongSerializer   |
         
        
         Other common properties for a Kafka producer:
         - bootstrap.servers (defaults to 0.0.0.0:9092)
         - acks (defaults to "all")
         - retries (defaults to 0)
         - batch.size (defaults to 16384)
         - linger.ms (defaults to 1)
         - buffer.memory (defaults to 33554432)
         - client.id (defaults to KafkaQAProducer)
         
        Parameters:
        message - Message to send (will be converted to the proper type specified by the value.serializer prop). String is default
        topic_name - Name of the topic where to send the message
        recordKey - Key of the kafka record
        table - Table containing alternative properties for the producer
        Throws:
        InterruptedException - InterruptedException
        ExecutionException - ExecutionException
        TimeoutException - TimeoutException
        See Also:
        sendMessageToTopic(String, String, String, String), sendAvroMessageToTopicWithProperties(String, String, String, DataTable)
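The automatic casting described above (message string converted to the type implied by value.serializer) can be sketched roughly as follows. This is an illustrative stand-in, not the library's actual conversion code; the class and method names are hypothetical:

```java
public class ValueCasterSketch {
    /** Illustrative only: maps the value.serializer class name to the Java value
     *  the message string would be converted to before sending. */
    public static Object castForSerializer(String message, String valueSerializer) {
        if (valueSerializer.endsWith("LongSerializer")) {
            // A non-numeric message like "hello" throws NumberFormatException here,
            // mirroring the error described in the docs above
            return Long.parseLong(message);
        }
        if (valueSerializer.endsWith("StringSerializer")) {
            return message; // default: send as-is
        }
        throw new IllegalArgumentException("Unsupported serializer: " + valueSerializer);
    }
}
```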
      • checkNumberOfPartitions

        @Then("^The number of partitions in topic \'(.+?)\' should be \'(.+?)\'\'?$")
        public void checkNumberOfPartitions​(String topic_name,
                                            int numOfPartitions)
                                     throws Exception
        Check that the number of partitions is the expected for the given topic.
        Parameters:
        topic_name - Name of kafka topic
        numOfPartitions - Number of partitions
        Throws:
        Exception - Exception
        See Also:
        modifyTopicPartitions(int, String)
      • setSchemaRegistryURL

        @Given("^My schema registry is running at \'(.+)\'$")
        public void setSchemaRegistryURL​(String host)
                                  throws Throwable
        Sets URL of schema registry.

        Initializes the remote URL of the schema registry service for all future requests. It also sets the schema.registry.url property in the consumer and producer properties

         Example: To set the schema registry at localhost:8081:
         
              Given My schema registry is running at 'localhost:8081'
         
         
        Parameters:
        host - Remote host and port (defaults to http://0.0.0.0:8081)
        Throws:
        Throwable - Throwable
        See Also:
        registerNewSchema(String, String), createNewAvroMessageFromRegistry(String, String, String, String, DataTable)
      • registerNewSchema

        @Then("^I register a new version of a schema under the subject \'(.+)\' with \'(.+)\'$")
        public void registerNewSchema​(String subjectName,
                                      String filepath)
                               throws Throwable
        Adds a new schema to the schema registry.

        Generates a POST request to the schema registry to add a new schema under the given subject

         Example: Assuming the file located under schemas/recordSchema.avsc contains the following valid schema
        
         {
             "namespace": "com.mynamespace",
             "type": "record",
             "name": "Record",
             "fields": [
                 { "name": "str1", "type": "string" },
                 { "name": "str2", "type": "string" },
                 { "name": "int1", "type": "int" }
              ]
         }
        
         Then, to set it at the schema registry at localhost:8081:
         
              Given My schema registry is running at 'localhost:8081'
              Then I register a new version of a schema under the subject 'record' with 'schemas/recordSchema.avsc'
         
         
        Parameters:
        subjectName - Name of the subject under which to register the new schema
        filepath - Path of the file containing the schema
        Throws:
        Throwable - Throwable
        See Also:
        setSchemaRegistryURL(String), createNewAvroMessageFromRegistry(String, String, String, String, DataTable)
      • assertTopicContainsMessageWithProperties

        @Then("^The kafka topic \'(.+?)\' has a message containing \'(.+?)\'( as key)? with:$")
        public void assertTopicContainsMessageWithProperties​(String topicName,
                                                             String message,
                                                             String isKey,
                                                             io.cucumber.datatable.DataTable dataTable)
                                                      throws Throwable
        Reads messages from a topic with properties.

        Reads messages from the beginning of the topic with the specified properties for the consumer. Each message is cast to the correct type based on the given value.deserializer property (the String deserializer is used by default)

        For example, the properties of the consumer can be altered to change the default deserializer for key/value when reading. In this case, for the key.deserializer property the value should be "org.apache.kafka.common.serialization.StringDeserializer" and for the value.deserializer the value should be "org.apache.kafka.common.serialization.LongDeserializer".

        The library will try to automatically cast the message to the type of the specified value.deserializer property. So, for example, trying to read the message "hello" using LongDeserializer for value.deserializer will produce an error.

         Example: Setting the read properties as String for key and Long for value before reading
         
              Then The kafka topic 'longTopic' has a message containing '1234567890' with:
                  | key.deserializer    | org.apache.kafka.common.serialization.StringDeserializer |
                  | value.deserializer  | org.apache.kafka.common.serialization.LongDeserializer   |
         
        
         Other common properties for a Kafka consumer:
        
         - bootstrap.servers (defaults to 0.0.0.0:9092)
         - group.id (defaults to "test")
         - enable.auto.commit (defaults to true)
         - auto.offset.reset (defaults to 'earliest')
         - auto.commit.interval.ms (defaults to 1000)
         - session.timeout.ms (defaults to 10000)
        
         
        Parameters:
        topicName - Name of the topic where to send the message
        message - Message to send (will be converted to the correct type according to the value.deserializer property)
        isKey - String for matching the optional ' as key' text in Gherkin
        dataTable - Table containing properties for consumer
        Throws:
        Throwable - Throwable
        See Also:
        assertTopicContainsMessage(String, String, String), assertTopicContainsPartialAvroMessageWithProperties(String, String, int, DataTable), assertTopicContainsAvroMessageWithProperties(String, String, DataTable)
      • createNewAvroMessage

        @Then("^I create the avro record \'(.+?)\' from the schema in \'(.+?)\'( based on \'(.+?)\')? with:$")
        public void createNewAvroMessage​(String recordName,
                                         String schemaFile,
                                         String seedFile,
                                         io.cucumber.datatable.DataTable table)
                                  throws Throwable
        Creates an Avro record from the specified schema.

        The record is created as a GenericRecord and the user can dynamically specify the properties values.

         Example: To create a new GenericRecord with name 'record' using the schema under schemas/recordSchema.avsc
         
              Given My schema registry is running at 'localhost:8081'
              Then I register a new version of a schema under the subject 'record' with 'schemas/recordSchema.avsc'
              Then I create the avro record 'record' from the schema in 'schemas/recordSchema.avsc' with:
                  | str1    | str1 |
                  | str2    | str2 |
                  | int1    |   1  |
         
        
         The library will automatically cast the values to the correct types specified in the *.avsc file. So, trying
         to insert a string for "int1" will generate an error
         
        Parameters:
        recordName - Name of the Avro generic record
        schemaFile - File containing the schema of the message
        seedFile - Optional seed file the record is based on
        table - Table contains the values for the fields on the schema. (Values will be converted according to field type)
        Throws:
        Throwable - Throwable
        See Also:
        sendAvroMessageToTopicWithProperties(String, String, String, DataTable), createNewAvroMessageFromRegistry(String, String, String, String, DataTable)
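The per-field casting described above can be pictured as a small conversion keyed on the Avro field type. A minimal sketch under stated assumptions: "string" and "int" are the types used in the recordSchema.avsc example earlier in this document, and the class and method names here are hypothetical, not the library's actual code:

```java
public class AvroFieldCasterSketch {
    /** Illustrative only: converts a datatable cell to the Java value matching
     *  the Avro field type declared in the *.avsc schema. */
    public static Object cast(String cell, String avroType) {
        switch (avroType) {
            case "int":
                // A non-numeric cell for an "int" field fails here, mirroring
                // the error described in the docs above
                return Integer.parseInt(cell.trim());
            case "string":
                return cell;
            default:
                throw new IllegalArgumentException("Unhandled Avro type: " + avroType);
        }
    }
}
```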
      • createNewAvroMessageFromRegistry

        @Then("^I create the avro record \'(.+?)\' using version \'(.+?)\' of subject \'(.+?)\' from registry( based on \'(.+?)\')? with:$")
        public void createNewAvroMessageFromRegistry​(String recordName,
                                                     String versionNumber,
                                                     String subject,
                                                     String seedFile,
                                                     io.cucumber.datatable.DataTable table)
                                              throws Throwable
        Creates a new Avro record by reading the schema directly from the schema registry for the specified subject and version
        Parameters:
        recordName - Name of the record
        versionNumber - Version number of the schema
        subject - Subject name
        seedFile - Seed file to use
        table - Modifications datatable
        Throws:
        Throwable - Throwable
      • sendAvroMessageToTopicWithProperties

        @When("^I send the avro record \'(.+?)\' to the kafka topic \'(.+?)\'( with key \'(.+?)\')? with:$")
        public void sendAvroMessageToTopicWithProperties​(String genericRecord,
                                                         String topicName,
                                                         String recordKey,
                                                         io.cucumber.datatable.DataTable table)
                                                  throws Throwable
        Send the previously created Avro record.

        The value.serializer property for the producer is automatically set to KafkaAvroSerializer.

         Example:
         
              When I send the avro record 'record' to the kafka topic 'avroTopic' with:
                  | key.serializer    | org.apache.kafka.common.serialization.StringSerializer |
         
         
        Parameters:
        genericRecord - Name of the record to send
        topicName - Topic where to send the record
        recordKey - Record key
        table - Table containing modifications for the producer properties
        Throws:
        Throwable - Throwable
        See Also:
        createNewAvroMessage(String, String, String, DataTable), createNewAvroMessageFromRegistry(String, String, String, String, DataTable)
      • configureConsumerProperties

        @Then("^I configure the kafka consumers with:$")
        public void configureConsumerProperties​(io.cucumber.datatable.DataTable dataTable)
        Modify consumer properties.

        A single step for modifying the consumer properties for the rest of the scenario.

         Example: To change consumer properties
         
              Then I configure the kafka consumers with:
                  | group.id           | ${ID}                                                                                               |
                  | key.deserializer   | org.apache.kafka.common.serialization.StringDeserializer                                            |
                  | value.deserializer | org.apache.kafka.common.serialization.StringDeserializer                                            |
                  | bootstrap.servers  | srdc1kafkassl5.privalia.pin:9092,srdc1kafkassl9.privalia.pin:9092,srdc1kafkassl10.privalia.pin:9092 |
         
         
        Parameters:
        dataTable - table with consumer properties
        See Also:
        configureProducerProperties(DataTable)
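The effect of this step can be pictured as the datatable rows being merged over the consumer defaults listed earlier in this document. A minimal sketch (class and method names are hypothetical; default values are the ones documented above):

```java
import java.util.Properties;

public class ConsumerPropsSketch {
    /** Builds consumer properties: documented defaults first, then key/value
     *  pairs from the datatable replacing them for the rest of the scenario. */
    public static Properties withOverrides(String... keyValues) {
        Properties props = new Properties();
        // Defaults as documented in this class's Javadoc
        props.put("bootstrap.servers", "0.0.0.0:9092");
        props.put("group.id", "test");
        props.put("enable.auto.commit", "true");
        props.put("auto.offset.reset", "earliest");
        // Datatable rows win over the defaults
        for (int i = 0; i + 1 < keyValues.length; i += 2) {
            props.put(keyValues[i], keyValues[i + 1]);
        }
        return props;
    }
}
```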
      • configureProducerProperties

        @Then("^I configure the kafka producer with:$")
        public void configureProducerProperties​(io.cucumber.datatable.DataTable dataTable)
        Modify producer properties.

        A single step for modifying the producer properties for the rest of the scenario.

         Example: To change producer properties settings:
         
              Then I configure the kafka producer with:
                  | client.id         | QAderInjector                                                                                       |
                  | key.serializer    | org.apache.kafka.common.serialization.StringSerializer                                              |
                  | value.serializer  | org.apache.kafka.common.serialization.StringSerializer                                              |
                  | bootstrap.servers | srdc1kafkassl5.privalia.pin:9092,srdc1kafkassl9.privalia.pin:9092,srdc1kafkassl10.privalia.pin:9092 |
         
         
        Parameters:
        dataTable - table with producer properties
        See Also:
        configureConsumerProperties(DataTable)
      • disconnectFromKafka

        @Then("^I close the connection to kafka$")
        public void disconnectFromKafka()
                                 throws Throwable
        Close the connection to kafka.
         Example:
         
              Then I close the connection to kafka
         
         
        Throws:
        Throwable - the throwable
        See Also:
        connectToKafka(String, String)
      • assertTopicContainsPartialAvroMessageWithProperties

        @And("^The kafka topic \'(.+?)\' has( at least)? \'(.+?)\' an avro message with:$")
        public void assertTopicContainsPartialAvroMessageWithProperties​(String topicName,
                                                                        String atLeast,
                                                                        int expectedCount,
                                                                        io.cucumber.datatable.DataTable datatable)
                                                                 throws Throwable
        Performs a partial property match on the returned Avro records.
         Example:
         
              Then The kafka topic 'avroTopic' has at least '1' an avro message with:
                  | user.id    | Paul |
         
         
        Parameters:
        topicName - Name of the topic to read messages from
        atLeast - Indicates to find at least the expectedCount. If omitted, asserts that the exact quantity is found
        expectedCount - Expected amount of records to find that match the given conditions
        datatable - Expected conditions
        Throws:
        Throwable - the throwable
        See Also:
        assertTopicContainsAvroMessageWithProperties(String, String, DataTable)