
Kafka messages sometimes remain unprocessed by a JMS source for unexplained reasons. To ensure that all Kafka messages are consumed and published successfully, Adeptia Connect integrates with Apache Kafka, enabling application components to create, send, receive, and read messages over reliable communication.

To use Apache Kafka, you need to create a Kafka account and define a Kafka listener and a target. Kafka messages can be consumed and published by a template, transaction, or process flow. 

Creating Kafka Account

A Kafka account is used to connect to the Kafka server. While creating a Kafka account, you define the following properties that are used to make the connection. 

  1. Click Configure > ACCOUNTS > Kafka.  

  2. Click Create Kafka Account.
  3. In the Create Kafka Account window, do the following:



    1. In the Name and Description fields, enter the name and description respectively for the new Kafka account.
    2. In the Broker field, enter the URL of the Kafka brokers, for example, host1:port1,host2:port2.
    3. In the Security Protocol field, enter the protocol used to communicate with brokers.

      The supported protocols are PLAINTEXT and SSL. 

      For SSL:

      • If you are using SSL for client authentication, select SSL in the Security Protocol field, and then select a keystore in the Select Keystore field.

      • If you are using SSL for server authentication, you must import the Kafka server certificate into the Adeptia Connect Truststore. To import the certificate, click here.

    4. In the Add On Configuration field, enter the add-on configurations to be used in the back end for performing operations on the Kafka server (for example, param1=value1,param2=value2).

      Here you can use properties that are not listed on this Kafka account creation interface. For example, you may need to use a serializer property that converts the data to bytes. You can find the available properties at this location.
    5. In the Select Project field, select the project.
    6. Click Save.
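Behind the scenes, these account fields correspond to standard Apache Kafka client connection properties. The sketch below illustrates that mapping; the helper function, broker addresses, and keystore path are placeholders for illustration, not part of Adeptia Connect:

```python
def build_connection_config(brokers, security_protocol="PLAINTEXT", keystore=None):
    """Sketch: map the Kafka account fields to standard Kafka client properties."""
    config = {
        "bootstrap.servers": brokers,            # Broker field: host1:port1,host2:port2
        "security.protocol": security_protocol,  # Security Protocol field: PLAINTEXT or SSL
    }
    if security_protocol == "SSL" and keystore is not None:
        # Keystore selected in the Select Keystore field (SSL client authentication)
        config["ssl.keystore.location"] = keystore
    return config

# Example: an SSL account pointing at two placeholder brokers
cfg = build_connection_config("host1:9092,host2:9092", "SSL", "/path/to/keystore.jks")
```

Any extra properties entered in the Add On Configuration field would simply be merged into this same property map.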

Creating Kafka Listener

Follow the steps below to create Kafka listener:

  1. Click Configure > EVENTS > Kafka Listener.
  2. Click Create Kafka Listener.
  3. In the Create Kafka Listener window, do the following:



    1. In the Name and Description fields, enter the name and description respectively for the Kafka listener.
    2. In the Kafka Account field, select the Kafka account. You can also create a new Kafka account by clicking the add icon next to this field.

      Click Test to check whether you can connect to the Kafka server using this Kafka account. 
    3. In the Topic(s) field, select the topic to be used. 

      You can select multiple topics.
    4. In the Consumer Group field, enter the string that uniquely identifies the group of consumer processes to which this consumer belongs. By setting the same consumer group, multiple processes indicate that they are all part of the same consumer group.
    5. In the Key(s) to Filter Messages field, enter the name of the message key to process. Use a comma to separate multiple message keys.
    6. In the Consumer Count field, enter the number of consumers that connect to the Kafka server.
    7. In the Process Flow Name field, select the process flow.
    8. In the Auto Offset Reset field, select the value that defines the offset behavior when there is no initial offset in ZooKeeper or when an offset is out of range:
      • EARLIEST : Automatically reset the offset to the earliest offset.
      • LATEST : Automatically reset the offset to the latest offset.
      • NONE : Throw an exception to the consumer if no previous offset is found.
    9. Select the Enable Message Aggregator check box to define the message aggregator properties:
      1. In the Number of Messages for Aggregation field, enter the number of messages to be aggregated.
      2. In the Aggregation Timespan field, enter the time in seconds. Messages received within the provided timespan will be aggregated.
      3. In the Message Type field, select the message format.

        The supported message formats are JSON, XML, and PLAINTEXT.

      4. In the Message Joiner field, enter the delimiter based on which the messages will be aggregated.
        It depends on the Message Type.
        • XML : It will be an XPath.
        • PLAINTEXT : It will be a delimiter.
        • JSON : It can be empty or an element name.
    10. In the Seek To field, select whether the Kafka consumer reads messages from the beginning or the end on startup.
    11. In the Heartbeat Interval field, enter the heartbeat interval.
      The expected time between heartbeats to the consumer coordinator when using Kafka's group management facilities. Heartbeats ensure that the consumer's session stays active and facilitate rebalancing when new consumers join or leave the group. The value must be set lower than the session timeout, and typically no higher than 1/3 of that value. It can be adjusted even lower to control the expected time for normal rebalances.
    12. In the Session Timeout field, enter the time interval for session timeout.
      The timeout used to detect consumer failures when using Kafka's group management facilities.
    13. In the Select Project field, select the project.
    14. Click Save.
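The listener fields above line up with standard Apache Kafka consumer properties. The following is a minimal sketch under that assumption; the group name and timing values are placeholder examples, and the guard simply enforces the rule of thumb from the Heartbeat Interval step (heartbeat no higher than 1/3 of the session timeout):

```python
def build_consumer_config(brokers, group_id, auto_offset_reset="latest",
                          heartbeat_ms=3000, session_timeout_ms=10000):
    """Sketch: map the Kafka listener fields to standard Kafka consumer properties."""
    # Keep the heartbeat no higher than 1/3 of the session timeout.
    if heartbeat_ms > session_timeout_ms // 3:
        raise ValueError("heartbeat interval should be <= 1/3 of the session timeout")
    return {
        "bootstrap.servers": brokers,
        "group.id": group_id,                      # Consumer Group field
        "auto.offset.reset": auto_offset_reset,    # EARLIEST / LATEST / NONE
        "heartbeat.interval.ms": heartbeat_ms,     # Heartbeat Interval field
        "session.timeout.ms": session_timeout_ms,  # Session Timeout field
    }

# Example: a consumer group that replays from the earliest available offset
consumer_cfg = build_consumer_config("host1:9092", "order-consumers", "earliest")
```

Two listeners configured with the same Consumer Group string would share the same `group.id`, so Kafka balances the topic's partitions between them instead of delivering every message to both.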

Creating Kafka Target

Follow the steps below to create Kafka target:

  1. Click Configure > TARGETS > Kafka Target.
  2. Click Create Kafka Target.
  3. In the Create Kafka Target window, do the following:



    1. In the Name and Description fields, enter the name and description respectively for the Kafka target.
    2. In the Kafka Account field, select the Kafka account. 

    3. In the Topic(s) field, select the topic to be used. 

    4. In the Message Key field, enter the message key.
    5. Select the Enable Message Splitter check box to split the message by defining the following properties:
      1. In the Message Type field, select the message format.

        The supported message formats are JSON, XML, and PLAINTEXT.

      2. In the Message Splitter field, enter the delimiter based on which the messages will be split.
        It depends on the Message Type.

        • XML : It will be an element name of the record.
        • PLAINTEXT : It will be a delimiter.
        • JSON: It can be empty or an element name.
    6. In the Compression field, specify the compression codec for all data generated by this producer, for example, none, gzip, snappy, or lz4.
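The target fields likewise map onto standard Apache Kafka producer properties, and for PLAINTEXT messages the splitter behaves as a simple delimiter split. A hypothetical sketch under those assumptions (the codec list reflects common Kafka compression codecs, and only the PLAINTEXT splitting case is shown; XML and JSON splitting are format-specific):

```python
def build_producer_config(brokers, compression="none"):
    """Sketch: map the Kafka target fields to standard Kafka producer properties."""
    if compression not in {"none", "gzip", "snappy", "lz4"}:
        raise ValueError("unsupported compression codec: " + compression)
    return {
        "bootstrap.servers": brokers,
        "compression.type": compression,  # Compression field
    }

def split_plaintext_message(message, splitter):
    """Sketch of PLAINTEXT splitting: the Message Splitter field is a delimiter."""
    return message.split(splitter)

# Example: a gzip-compressed target splitting a pipe-delimited message
producer_cfg = build_producer_config("host1:9092", "gzip")
parts = split_plaintext_message("rec1|rec2|rec3", "|")
```

Each element of `parts` would be published to the selected topic as a separate message, carrying the configured Message Key.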





