Apache Kafka is a distributed streaming platform that is highly scalable and secure, and it can:

  • Consume and publish messages/event streams, similar to an enterprise messaging system.
  • Store the messages for as long as you want.
  • Process the messages/event streams as they occur.

To connect with Kafka, you need to create a Kafka account and define a Kafka listener or a Kafka target. Kafka messages can be consumed and published by a template, transaction, or process flow. 


Creating Kafka Account

A Kafka account is used to connect with the Kafka server. While creating a Kafka account, you need to define the following properties, which are used to connect to the Kafka server.

  1. Click Configure > ACCOUNTS > Kafka.  

  2. Click Create Kafka Account.
  3. In the Create Kafka Account window, do the following:


    1. In the Name and Description fields, enter the name and description respectively for the new Kafka account.
    2. In the Broker field, enter the URL of the Kafka brokers, for example, host1:port1,host2:port2.
    3. In the Security Protocol field, enter the protocol used to communicate with brokers.

      Tip

      The supported protocols are PLAINTEXT and SSL. 

      For SSL:

      • If you are using SSL for client authentication, select SSL in the Security Protocol field and then select a keystore in the Select Keystore field.

      • If you are using SSL for server authentication, you must import the Kafka server certificate into the Adeptia Connect Truststore. To import the certificate, click here.


    4. In the Addon Configuration field, enter the add-on configurations to be used in the back end for performing operations on the Kafka server (for example, param1=value1,param2=value2).

      Note
      Here you can use properties that are not listed on this Kafka account creation interface. For example, you may need to use a property called Serializer that converts the data to bytes. You can find the available properties at this location.


    5. In the Select Project field, select the project.
    6. Click Save.
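The account form above maps onto standard Kafka client connection properties. As an illustration only, the sketch below shows how such a configuration could be assembled and validated; the function name and property keys are assumptions for this example, not Adeptia's internal implementation.

```python
def build_account_config(brokers, security_protocol, addon=""):
    """Illustrative sketch: assemble a Kafka-style property map from the
    account fields described above. Key names are hypothetical."""
    if security_protocol not in ("PLAINTEXT", "SSL"):
        raise ValueError("Supported protocols are PLAINTEXT and SSL")
    # Brokers are given as a comma-separated host:port list,
    # e.g. "host1:port1,host2:port2".
    for broker in brokers.split(","):
        host, _, port = broker.partition(":")
        if not host or not port:
            raise ValueError(f"Broker must be host:port, got: {broker!r}")
    config = {
        "bootstrap.servers": brokers,
        "security.protocol": security_protocol,
    }
    # Addon configuration: extra properties given as param1=value1,param2=value2
    for pair in filter(None, addon.split(",")):
        key, _, value = pair.partition("=")
        config[key.strip()] = value.strip()
    return config

print(build_account_config("host1:9092,host2:9092", "PLAINTEXT",
                           "param1=value1,param2=value2"))
```

A malformed broker entry (for example, a missing port) would be rejected before any connection attempt, mirroring the validation the account form performs.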

Creating Kafka Listener

Follow the steps below to create a Kafka listener:

  1. Click Configure > EVENTS > Kafka Listener.
  2. Click Create Kafka Listener.
  3. In the Create Kafka Listener window, do the following:


    1. In the Name and Description fields, enter the name and description respectively for the Kafka listener.
    2. In the Kafka Account field, select the Kafka account. You can also create a new Kafka account from this window.

      Note
      Click Test to check if you are able to connect to the Kafka server using this Kafka account.


    3. In the Topic(s) field, select the topic to be used.

      Tip
      You can select multiple topics.


    4. In the Consumer Group field, enter the string that uniquely identifies the group of consumer processes to which this consumer belongs. By setting the same consumer group, multiple processes indicate that they are all part of the same consumer group.
    5. In the Key(s) to Filter Messages field, enter the name of the message key to process. Use commas to provide multiple message keys.
    6. In the Consumer Count field, enter the number of consumers that connect to the Kafka server.
    7. In the Process Flow Name field, select the process flow.
    8. In the Auto Offset Reset field, select the value that defines the offset to use when there is no initial offset in ZooKeeper or when an offset is out of range:
      • EARLIEST : Automatically reset the offset to the earliest offset.
      • LATEST : Automatically reset the offset to the latest offset.
      • NONE : Throw an exception to the consumer if no previous offset is found.
    9. Select the Enable Message Aggregator check box to define the message aggregator properties:
      1. In the Number of Messages for Aggregation field, enter the number of messages to be aggregated.
      2. In the Aggregation Timespan field, enter the time in seconds; messages received within the provided timespan will be aggregated.
      3. In the Message Type field, select the message format.

        Tip

        The supported message formats are JSON, XML, and PLAINTEXT.


      4. In the Message Joiner field, enter the delimiter based on which the messages will be aggregated.
        It depends on the Message Type.
        • XML : It will be an XPath.
        • PLAINTEXT : It will be a delimiter.
        • JSON : It can be empty or an element name.
    10. In the Seek To field, select whether the Kafka consumer will read messages from the beginning or the end on startup.
    11. In the Heartbeat Interval field, enter the heartbeat interval.
      The expected time between heartbeats to the consumer coordinator when using Kafka group management facilities. Heartbeats are used to ensure that the consumer's session stays active and to facilitate rebalancing when new consumers join or leave the group. The value must be set lower than the session timeout, but typically should be set no higher than one-third of that value. It can be adjusted even lower to control the expected time for normal rebalances.
    12. In the Session Timeout field, enter the time interval for session timeout.
      The timeout used to detect failures when using Kafka group management facilities.
    13. In the Select Project field, select the project.
    14. Click Save.

      Note
      • Kafka Listener and Target created from the manage page are available to be used only in the process flow. Every time you create a template with Kafka as a source, you need to define the properties for the Kafka listener. The same holds true when you create a template with Kafka as a destination. 
      • After you create a Kafka Listener, you may need to configure the property abpm.services.kafka.messages.location to specify the location where you want to save the Kafka messages. Defining this property is a one time activity – you need not define this property every time you create a Listener.
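The aggregator behavior described in step 9 (flush when either the message count or the aggregation timespan is reached, then join the buffered messages with the joiner) can be sketched as follows. This is an illustrative model of the described behavior, not Adeptia's implementation; all names are hypothetical.

```python
import time

class MessageAggregator:
    """Illustrative sketch: buffer messages until a count or a timespan
    is reached, then join them with a delimiter."""

    def __init__(self, count, timespan_seconds, joiner):
        self.count = count
        self.timespan = timespan_seconds
        self.joiner = joiner
        self.buffer = []
        self.first_seen = None

    def add(self, message, now=None):
        """Buffer a message; return the joined batch once complete, else None."""
        now = time.monotonic() if now is None else now
        if not self.buffer:
            self.first_seen = now
        self.buffer.append(message)
        # Flush when the count is reached or the timespan has elapsed.
        if len(self.buffer) >= self.count or now - self.first_seen >= self.timespan:
            batch, self.buffer = self.joiner.join(self.buffer), []
            return batch
        return None

agg = MessageAggregator(count=3, timespan_seconds=5, joiner="\n")
print(agg.add("m1", now=0.0))   # None: still buffering
print(agg.add("m2", now=1.0))   # None
print(agg.add("m3", now=2.0))   # count reached, batch is flushed
```

For XML the joiner would be an XPath and for PLAINTEXT a plain delimiter, as described in step 9d; the sketch above only models the PLAINTEXT case.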


Activating or deactivating the Kafka Listener

You can activate or deactivate a Kafka Listener by using the corresponding option on the Kafka listener manage page.

Note
Activate or Deactivate options will be available only for the Kafka Listeners that have been created from the manage page and published as a process flow.
  • Once activated, the listener will start consuming the messages from Kafka Server and trigger the process flow for the received message.
  • Once deactivated, the listener will stop receiving Kafka messages from the server.

Creating Kafka Target

Follow the steps below to create a Kafka target:

  1. Click Configure > TARGETS > Kafka Target.
  2. Click Create Kafka Target.
  3. In the Create Kafka Target window, do the following:


    1. In the Name and Description fields, enter the name and description respectively for the Kafka target.
    2. In the Kafka Account field, select the Kafka account. 

    3. In the Topic(s) field, select the topic to be used.

    4. In the Message Key field, enter the message key.
    5. Select the Enable Message Splitter check box to split the message by defining the following properties:
      1. In the Message Type field, select the message format.

        Tip

        The supported message formats are JSON, XML, and PLAINTEXT.


      2. In the Message Splitter field, enter the delimiter based on which the messages will be split.
        It depends on the Message Type.

        • XML : It will be an element name of the record.
        • PLAINTEXT : It will be a delimiter.
        • JSON: It can be empty or an element name.
    6. In the Compression field, enter the parameter that specifies the compression codec for all data generated by this producer.
    7. In the Request Required Acks field, select the number of acknowledgments the producer requires the leader to have received before considering a request complete.

      This controls the durability of records that are sent. The following settings are common:

      • If set to 0, the producer will not wait for any acknowledgment from the server at all. The record will be immediately added to the socket buffer and considered sent. No guarantee can be made that the server has received the record in this case, and the retries configuration will not take effect (as the client won't generally know of any failures). The offset given back for each record will always be set to -1.

      • If set to 1, the leader will write the record to its local log but will respond without awaiting full acknowledgement from all followers. In this case, should the leader fail immediately after acknowledging the record but before the followers have replicated it, the record will be lost.

      • If set to all, the leader will wait for the full set of in-sync replicas to acknowledge the record. This guarantees that the record will not be lost as long as at least one in-sync replica remains alive. This is the strongest available guarantee.

    8. In the Request Timeout field, enter the amount of time the broker will wait trying to meet the RequestRequiredAcks requirement before sending back an error to the client.

    9. In the Number of Retries field, enter the value.
      Setting a value greater than zero will cause the client to resend any record whose send fails with a potentially transient error. Note that this retry is no different than if the client resent the record upon receiving the error. Allowing retries will potentially change the ordering of records because if two records are sent to a single partition, and the first fails and is retried but the second succeeds, then the second record may appear first.

    10. In the Select Project field, select the project.

    11. Click Save.
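The splitter behavior described in step 5 (one outgoing message split into multiple Kafka messages based on the message type and splitter value) can be sketched as below. This is an illustration under the stated rules, not Adeptia's implementation; the function name and the exact JSON semantics are assumptions.

```python
import json
import xml.etree.ElementTree as ET

def split_message(message, message_type, splitter):
    """Illustrative sketch: split one outgoing message into multiple
    Kafka messages according to the message type and splitter value."""
    if message_type == "PLAINTEXT":
        # The splitter is a plain delimiter.
        return message.split(splitter)
    if message_type == "XML":
        # The splitter is the element name of one record.
        root = ET.fromstring(message)
        return [ET.tostring(rec, encoding="unicode") for rec in root.iter(splitter)]
    if message_type == "JSON":
        data = json.loads(message)
        # Assumption: an empty splitter splits a top-level array; a named
        # splitter splits the array found under that element.
        records = data if splitter == "" else data[splitter]
        return [json.dumps(rec) for rec in records]
    raise ValueError(f"Unsupported message type: {message_type}")

print(split_message("a;b;c", "PLAINTEXT", ";"))
print(split_message('{"rows": [1, 2]}', "JSON", "rows"))
```

Each returned element would be published to the selected topic as a separate message.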

Configuring Kafka properties

To enable proper processing of messages in Kafka, you may need to set the relevant properties. There are two properties that govern the storage and splitting of the Kafka messages. To update the properties, follow the steps given below:

  1. Click Accounts > Settings.
  2. Expand Microservice Settings, and then select Runtime in the left panel.
  3. Expand the property category, Kafka Configuration.
  4. Double-click the corresponding value field for the property abpm.services.kafka.target.special.characters to change its value.
  5. Click Update.
    You'll see a message saying 'Properties saved successfully'. 

Using SASL security protocol in Kafka

Adeptia Connect supports Simple Authentication and Security Layer (SASL) security protocol in Kafka. SASL security protocol can be used by defining a set of properties in the Addon Configuration field. These properties are set based on the protocol (PLAINTEXT or SSL) that you have selected in the Security Protocol field.

Follow the steps below to use SASL protocol in Kafka:

  1. Select the security protocol in the Security Protocol field based on your Kafka server setting.
    Note: The supported protocols are PLAINTEXT and SSL.
  2. Based on the protocol you have selected, define the SASL properties in the Addon Configuration field as explained in the example below:
    Note: Define each property in a new line.

This is an example in case you have selected PLAINTEXT as a security protocol.

saslMechanism=PLAIN
securityProtocol=SASL_PLAINTEXT
saslJaasConfig= org.apache.kafka.common.security.plain.PlainLoginModule required username="admin" password="admin-secret";

The table below describes the properties and their values:

  • saslMechanism : The SASL mechanism used. For the valid values, click here.
  • securityProtocol : The protocol used to communicate with brokers. Use SASL_PLAINTEXT if you selected PLAINTEXT as the security protocol, or SASL_SSL if you selected SSL.
  • saslJaasConfig : Exposes the Kafka sasl.jaas.config parameter, for example, org.apache.kafka.common.security.plain.PlainLoginModule required username=USERNAME password=PASSWORD;
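Since each Addon Configuration property goes on its own line as key=value, the parsing can be pictured as below. This sketch is illustrative only; the function name is hypothetical and Adeptia's actual parsing may differ.

```python
def parse_addon_configuration(text):
    """Illustrative sketch: parse Addon Configuration lines
    (one key=value per line) into a property map."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        # Split on the first '=' only; the value itself (for example a
        # JAAS config string) may contain '=' characters.
        key, sep, value = line.partition("=")
        if not sep:
            raise ValueError(f"Expected key=value, got: {line!r}")
        config[key.strip()] = value.strip()
    return config

sasl = parse_addon_configuration(
    "saslMechanism=PLAIN\n"
    "securityProtocol=SASL_PLAINTEXT\n"
    'saslJaasConfig=org.apache.kafka.common.security.plain.PlainLoginModule '
    'required username="admin" password="admin-secret";'
)
print(sasl["securityProtocol"])
```

Note that the saslJaasConfig value contains embedded = characters, which is why splitting only on the first = matters.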

Using Kafka in a process flow

When you use a process flow in a Kafka listener, the File Source service that you have in the process flow starts receiving the messages when the listener is activated. The only condition to be met here is that the Event Context Enabled property for that source service should be enabled while creating the process flow.

Note
 If you want to send a message to Kafka, you can use a Kafka target in the process flow.

Perform the following steps to enable the Event Context Enabled property in a process flow:

  1. Select the Source service and then click View Properties option from the context pad menu. 
  2. In the Activity Properties panel, under the GENERAL tab, enable the Event Context Enabled property.
  3. Save the process flow.
  4. Activate the process flow. 

Once you save the process flow with this setting and activate it, you can choose it in the Process Flow Name field while creating a Kafka listener service.

Tip
Ensure that you have activated the Kafka listener service. See the section Activating or deactivating the Kafka Listener above for how to activate it.

Using Kafka in a Template

You can consume and publish a Kafka message using a template. The sections below describe how a Kafka message can be used in a template. 

Consuming Kafka message

To consume a Kafka message in a template, you can use that message as a source in the template. Follow the steps below to achieve this:

  1. Select Kafka as a source application.
  2. In the Which event should trigger the data transfer? field, select New Message and click Next.
  3. In the Provide Kafka account information field, select the Kafka account and click Next.
  4. In the Kafka Settings page, follow the instructions from step c through step i of the section Creating Kafka Listener.
  5. Click Next to complete the steps that follow to create the template. 

Publishing Kafka message to a Kafka server

To publish a Kafka message in a template, you can use that message as a destination in the template. Follow the steps below to achieve this:

  1. Select Kafka as a destination application.
  2. In the Provide Kafka account information field, select the Kafka account and click Next.
  3. In the How the data should be delivered to destination? field, select Send Message and click Next.
  4. In the Kafka Settings page, follow the instructions from step c through step i of the section Creating Kafka Target.
  5. Click Next to complete the steps that follow to create the template. 
