Knative Kafka provides integration options that enable you to use supported versions of the Apache Kafka message streaming platform with OpenShift Serverless. Kafka provides event source, channel, broker, and event sink capabilities.

Knative Kafka functionality is available in an OpenShift Serverless installation if a cluster administrator has installed the KnativeKafka custom resource.
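
As a rough illustration, a KnativeKafka custom resource that a cluster administrator creates in the knative-eventing namespace might look like the following sketch. The bootstrap server address is a placeholder, and which components you enable depends on your environment; exact field names can vary between OpenShift Serverless versions.

  apiVersion: operator.serverless.openshift.io/v1alpha1
  kind: KnativeKafka
  metadata:
    name: knative-kafka
    namespace: knative-eventing
  spec:
    source:
      # Enables the Kafka source component
      enabled: true
    channel:
      # Enables the Kafka channel component
      enabled: true
      # Placeholder address; replace with your Kafka cluster's bootstrap server
      bootstrapServers: my-cluster-kafka-bootstrap.kafka:9092
    broker:
      # Enables the Kafka broker component (Technology Preview)
      enabled: true
      defaultConfig:
        bootstrapServers: my-cluster-kafka-bootstrap.kafka:9092
    sink:
      # Enables the Kafka sink component (Technology Preview)
      enabled: true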

Knative Kafka is not currently supported on IBM Z and IBM Power.

Knative Kafka provides additional options, such as:

  • Kafka source

  • Kafka channel

  • Kafka broker (Technology Preview)

  • Kafka sink (Technology Preview)

Kafka event delivery and retries

Using Kafka components in an event-driven architecture provides "at least once" event delivery. This means that operations are retried until a successful return code is received. This makes applications more resilient to lost events; however, it might result in duplicate events being sent.

For the Kafka event source, there is a fixed number of retries for event delivery by default. For Kafka channels, retries are performed only if they are configured in the Kafka channel delivery spec.
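
For example, retry behavior for a Kafka channel is typically configured in the delivery spec of the Subscription that binds the channel to a subscriber. The following is a minimal sketch; the resource names, the subscriber and dead letter services, and the specific retry values are assumptions for illustration.

  apiVersion: messaging.knative.dev/v1
  kind: Subscription
  metadata:
    name: my-subscription
  spec:
    channel:
      apiVersion: messaging.knative.dev/v1beta1
      kind: KafkaChannel
      name: my-channel
    subscriber:
      ref:
        apiVersion: serving.knative.dev/v1
        kind: Service
        name: event-display
    delivery:
      # Number of redelivery attempts before the event is sent to the dead letter sink
      retry: 3
      # Backoff between attempts: "linear" or "exponential"
      backoffPolicy: exponential
      # Initial backoff delay, as an ISO-8601 duration
      backoffDelay: PT0.5S
      deadLetterSink:
        ref:
          apiVersion: serving.knative.dev/v1
          kind: Service
          name: dlq-handler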

See the Event delivery documentation for more information about delivery guarantees.

Kafka source

You can create a Kafka source that reads events from an Apache Kafka cluster and passes these events to a sink. You can create a Kafka source by using the OpenShift Container Platform web console or the Knative (kn) CLI, or by creating a KafkaSource object as a YAML file and applying it with the OpenShift CLI (oc).
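
For the YAML route, a minimal KafkaSource object might look like the following sketch; the bootstrap server address, topic, consumer group, and sink service name are placeholder assumptions.

  apiVersion: sources.knative.dev/v1beta1
  kind: KafkaSource
  metadata:
    name: kafka-source
  spec:
    # One or more Kafka bootstrap servers to connect to
    bootstrapServers:
      - my-cluster-kafka-bootstrap.kafka:9092
    # Topics to consume events from
    topics:
      - my-topic
    # Consumer group ID used by the source when reading from Kafka
    consumerGroup: my-consumer-group
    # Destination that receives the events
    sink:
      ref:
        apiVersion: serving.knative.dev/v1
        kind: Service
        name: event-display

Saving this sketch to a file and applying it, for example with oc apply -f kafka-source.yaml, creates the source and starts delivering events to the sink.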

Creating a Kafka event source by using the web console

After Knative Kafka is installed on your cluster, you can create a Kafka source by using the OpenShift Container Platform web console, which provides a streamlined and intuitive user interface for this task.

Prerequisites
  • The OpenShift Serverless Operator, Knative Eventing, and the KnativeKafka custom resource are installed on your cluster.

  • You have logged in to the web console.

  • You have access to a Red Hat AMQ Streams (Kafka) cluster that produces the Kafka messages you want to import.

  • You have created a project or have access to a project with the appropriate roles and permissions to create applications and other workloads in OpenShift Container Platform.

Procedure
  1. In the Developer perspective, navigate to the +Add page and select Event Source.

  2. On the Event Sources page, select Kafka Source in the Type section.

  3. Configure the Kafka Source settings:

    1. Add a comma-separated list of Bootstrap Servers.

    2. Add a comma-separated list of Topics.

    3. Add a Consumer Group.

    4. Select the Service Account Name for the service account that you created.

    5. Select the Sink for the event source. A Sink can be either a Resource, such as a channel, broker, or service, or a URI.

    6. Enter a Name for the Kafka event source.

  4. Click Create.

Verification

You can verify that the Kafka event source was created and is connected to the sink by viewing the Topology page.

  1. In the Developer perspective, navigate to Topology.

  2. View the Kafka event source and sink.