Kafka connections
This page shows all existing Kafka connections. Via the context menu, you can edit or delete them, or create new ones. Lobster Integration always acts as a Kafka client, never as a server, and currently uses only the Kafka message function (no streaming).
Creating a connection
(1) Alias: You can use this alias (as the name of the connection) in the Input Agent "Kafka" and in the Response "Kafka".
(2) Active: Messages can only be received and sent via active connections.
(3) Kafka URLs: Several hosts are usually defined here (separated by commas or semicolons), since Kafka is typically operated as a cluster (see the sketch after this list). Example: "kafka.server1.example.com:9092,kafka.server2.example.com:9092,kafka.server3.example.com:9092".
(4) Kafka properties: Additional connection properties can be defined via the context menu. Important note: Please do not set the property "group.id" here. If necessary, you can do this in the Input Agent "Kafka".
(5) Check connection: You can test the connection here. The test uses the currently set (and possibly not yet saved) values; to apply these values permanently, they must be saved.
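The following minimal Java sketch relates these settings to a plain Kafka client: the comma-separated host list from (3) becomes bootstrap.servers, and the additional properties from (4) are added to the same configuration. Lobster Integration performs this configuration internally; the topic, group ID, client ID and deserializers below are placeholder assumptions used only for illustration (a kafka-clients dependency is required).

import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class KafkaAliasSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // (3) Kafka URLs: the comma-separated host list becomes bootstrap.servers
        props.put("bootstrap.servers",
                "kafka.server1.example.com:9092,kafka.server2.example.com:9092,kafka.server3.example.com:9092");
        // (4) Kafka properties: additional connection properties, e.g. a client.id (placeholder value)
        props.put("client.id", "lobster-example");
        // group.id is set by the consuming side (in Lobster: the Input Agent), not by the connection alias
        props.put("group.id", "example-group");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("example-topic"));   // placeholder topic
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            records.forEach(r -> System.out.println(r.value()));
        }
    }
}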
OAuth2
Kafka supports OAuth2 using a third-party authorization server (e.g. Keycloak). A client callback handler that fetches the access token has to be specified and configured. The callback handler described below supports the Client Credentials grant only and takes up to four arguments (see the table and the sketch following it).
oauth.token.endpoint.uri | Specifies the token endpoint.
oauth.client.id | Client ID/username/principal.
oauth.client.secret | The password/secret.
oauth.fetch.token.via.dmz | (optional) "true" if the HTTPS request to fetch the token shall utilise the DMZ server. Default: "false".
All of the parameters listed in the table above are part of the sasl.jaas.config property in the Kafka alias settings (see the example below). The login module for this process is always org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule and the flag is always 'required'. The parameters for the callback handler follow the 'required' keyword as key="value" pairs (the quotation marks are mandatory).
The callback handler has to be made known to Kafka via the property sasl.login.callback.handler.class; the fully qualified class name of the callback handler is com.ebd.hub.datawizard.kafka.oauth.OAuth2ClientAuthCallbackHandler. The properties sasl.mechanism with the value OAUTHBEARER and security.protocol with the value SASL_SSL are also mandatory.
Example Kafka properties configuration
sasl.jaas.config = org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required oauth.token.endpoint.uri="https://example.com:8443/oauth/token" oauth.client.id="testuser" oauth.client.secret="myprecioussecret" oauth.fetch.token.via.dmz="false";
sasl.login.callback.handler.class = com.ebd.hub.datawizard.kafka.oauth.OAuth2ClientAuthCallbackHandler
sasl.mechanism = OAUTHBEARER
security.protocol = SASL_SSL
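For comparison, the same settings could be supplied programmatically to a plain Java Kafka client, as in the sketch below. It simply restates the placeholder values from the example above and assumes that the library providing com.ebd.hub.datawizard.kafka.oauth.OAuth2ClientAuthCallbackHandler is on the classpath.

import java.util.Properties;

public class OAuthPropertiesSketch {
    public static Properties oauthKafkaProperties() {
        Properties props = new Properties();
        // Same values as in the example configuration above (placeholders)
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required "
                        + "oauth.token.endpoint.uri=\"https://example.com:8443/oauth/token\" "
                        + "oauth.client.id=\"testuser\" "
                        + "oauth.client.secret=\"myprecioussecret\" "
                        + "oauth.fetch.token.via.dmz=\"false\";");
        // Callback handler, mechanism and protocol are mandatory for OAuth2
        props.put("sasl.login.callback.handler.class",
                "com.ebd.hub.datawizard.kafka.oauth.OAuth2ClientAuthCallbackHandler");
        props.put("sasl.mechanism", "OAUTHBEARER");
        props.put("security.protocol", "SASL_SSL");
        return props;
    }
}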