
Apache Kafka is a distributed streaming platform that is highly scalable and secure, and it can:

  • Consume and publish messages/event streams, similar to an enterprise messaging system.
  • Store the messages for as long as you want.
  • Process the messages/event streams as they occur.

To connect with Kafka, you need to create a Kafka account and define a Kafka listener or a Kafka target. Kafka messages can be consumed and published by a template, transaction, or process flow. 


Creating a Kafka Account

A Kafka account is used to connect with the Kafka server. While creating a Kafka account, you define the following connection properties.

  1. Click Configure > ACCOUNTS > Kafka.  

  2. Click Create Kafka account.
  3. In the Create Kafka Account window, do the following:



    1. In the Name and Description fields, enter a name and description for the new Kafka account.
    2. In the Broker field, enter the URL of the Kafka brokers, for example, host1:port1,host2:port2.
    3. In the Security Protocol field, enter the protocol used to communicate with brokers.

      The supported protocols are PLAINTEXT and SSL. 

      For SSL:

      • If you are using SSL for client authentication, select SSL in the Security Protocol field and then select a Keystore in the Select Keystore field.

      • If you are using SSL for server authentication, you must import the Kafka server certificate into the Adeptia Connect Truststore. To import the certificate, click here.

    4. In the Addon Configuration field, enter the add-on configurations to be used in the back end for performing operations on the Kafka server (for example, param1=value,param2=value).

      Here you can use properties that are not listed on this Kafka account creation interface. For example, you may need a Serializer property that converts the data to bytes. You can find the available properties at this location. A sketch of such a configuration follows these steps.
    5. In the Select Project field, select the project.
    6. Click Save.
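For illustration only, an Addon Configuration that sets the serializers mentioned above might look like the following. The keys keySerializer and valueSerializer are assumptions modeled on the camelCase keys used in the SASL example later on this page; verify the exact property names in the reference linked above.

keySerializer=org.apache.kafka.common.serialization.StringSerializer
valueSerializer=org.apache.kafka.common.serialization.StringSerializer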

Configuring Kafka properties

To enable proper processing of messages in Kafka, you may need to set the relevant properties. Two properties govern how Kafka messages are stored and split. To update them, follow the steps below:

  1. Click Account > Settings.
  2. Expand the Server Node Settings in the left panel.
  3. Select the server node.
  4. Click Edit.
  5. Expand the Kafka Configuration section.
  6. Edit the following properties.
    1. abpm.services.kafka.messages.location
      Enter the location where you want to save the Kafka messages.

      In a clustered environment, the abpm.services.kafka.messages.location property must be updated so that all nodes can read data from Kafka. Set its value to the absolute path on the shared drive, up to and including the Attachments folder, as shown in the example after these steps.


    2. abpm.services.kafka.target.special.characters
      Enter the set of special characters on which a message is split into separate records in Kafka.
  7. Click Save.
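As an illustration, a clustered setup might use values like the following. The shared-drive path and delimiter shown here are hypothetical; substitute your own:

abpm.services.kafka.messages.location=/mnt/shared/AdeptiaConnect/Attachments
abpm.services.kafka.target.special.characters=~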

Using SASL security protocol in Kafka

Adeptia Connect supports Simple Authentication and Security Layer (SASL) security protocol in Kafka. SASL security protocol can be used by defining a set of properties in the Addon Configuration field. These properties are set based on the protocol (PLAINTEXT or SSL) that you have selected in the Security Protocol field.

Follow the steps below to use SASL protocol in Kafka:

  1. Select the security protocol in the Security Protocol field based on your Kafka server setting.
    Note: The supported protocols are PLAINTEXT and SSL.
  2. Based on the protocol you have selected, define the SASL properties in the Addon Configuration field as explained in the example below:
    Note: Define each property in a new line.

The following example applies when you have selected PLAINTEXT as the security protocol.

saslMechanism=PLAIN
securityProtocol=SASL_PLAINTEXT
saslJaasConfig=org.apache.kafka.common.security.plain.PlainLoginModule required username="admin" password="admin-secret";

The table below describes the properties and their values:

Property key | Description
saslMechanism | The SASL mechanism used. For the valid values, click here.
securityProtocol | The protocol used to communicate with brokers: SASL_PLAINTEXT (if you selected PLAINTEXT as the security protocol) or SASL_SSL (if you selected SSL).
saslJaasConfig | Exposes the Kafka sasl.jaas.config parameter, for example, org.apache.kafka.common.security.plain.PlainLoginModule required username="USERNAME" password="PASSWORD";
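Along the same lines, a sketch for the SSL case would use SASL_SSL, per the table above. The credentials here are placeholders:

saslMechanism=PLAIN
securityProtocol=SASL_SSL
saslJaasConfig=org.apache.kafka.common.security.plain.PlainLoginModule required username="admin" password="admin-secret";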

Using Kafka in a process flow

When you use a Kafka listener with a process flow, the File Source service in the process flow starts receiving messages when the listener is activated. The only condition is that the Event Context Enabled property for that source service must be enabled while creating the process flow.

 If you want to send a message to Kafka, you can use a Kafka target in the process flow.

Perform the following steps to enable the Event Context Enabled property in a process flow:

  1. Select the Source service and then click the View Properties option from the context pad menu.
  2. In the Activity Properties panel, under the GENERAL tab, enable the Event Context Enabled property.
  3. Save the process flow.
  4. Activate the process flow. 

Once you save the process flow with this setting and activate it, you can choose it in the Process Flow Name field while creating a Kafka listener service.

Ensure that you have activated the Kafka listener service. Click here to learn how to activate it.

Using Kafka in a Template

You can consume and publish a Kafka message using a template. The sections below describe how a Kafka message can be used in a template. 

Consuming Kafka message

To consume a Kafka message in a template, use Kafka as the source in the template. Follow the steps below:

  1. Select Kafka as a source application.
  2. In the Which event should trigger the data transfer? field, select New Message and click Next.
  3. In the Provide Kafka account information field, select the Kafka account and click Next.
  4. On the Kafka Settings page, follow steps c through l from the section Creating Kafka Listener.
  5. Click Next to complete the steps that follow to create the template. 

Publishing Kafka message to a Kafka server

To publish a message to a Kafka server from a template, use Kafka as the destination in the template. Follow the steps below:

  1. Select Kafka as a destination application.
  2. In the Provide Kafka account information field, select the Kafka account and click Next.
  3. In the How the data should be delivered to destination? field, select Send Message and click Next.
  4. On the Kafka Settings page, follow steps c through i from the section Creating Kafka Target.
  5. Click Next to complete the steps that follow to create the template. 

  • If you want to receive email notifications in case of Kafka Listener failure, enter the email IDs, separated by commas, in the Contact User field on the Settings page while creating the Template.
  • If you want the Kafka Listener to be deactivated after it fails for a specified number of retries, set the property abpm.event.abort.retryCount, available in the server-configure.properties file, to that number. For example, to deactivate the Kafka Listener after 13 retries, set abpm.event.abort.retryCount to 13, as in the snippet below. The user also receives a notification about the deactivation of the Kafka Listener on the email ID set in the Contact User field on the Settings page of the Template.
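A minimal sketch of that setting in server-configure.properties, using the retry count from the example above:

abpm.event.abort.retryCount=13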