Apache Kafka on OpenShift Serverless is a Technology Preview feature only. Technology Preview features are not supported with Red Hat production service level agreements (SLAs) and might not be functionally complete. Red Hat does not recommend using them in production. These features provide early access to upcoming product features, enabling customers to test functionality and provide feedback during the development process.

For more information about the support scope of Red Hat Technology Preview features, see https://access.redhat.com/support/offerings/techpreview/.

You can use the KafkaChannel channel type and KafkaSource event source with OpenShift Serverless. To do this, you must install the Knative Kafka components and configure the integration between OpenShift Serverless and a supported Red Hat AMQ Streams cluster.

The OpenShift Serverless Operator provides the Knative Kafka API that can be used to create a KnativeKafka custom resource:

Example KnativeKafka custom resource
apiVersion: operator.serverless.openshift.io/v1alpha1
kind: KnativeKafka
metadata:
    name: knative-kafka
    namespace: knative-eventing
spec:
    channel:
        enabled: true (1)
        bootstrapServers: <bootstrap_server> (2)
    source:
        enabled: true (3)
1 Enables developers to use the KafkaChannel channel type in the cluster.
2 A comma-separated list of bootstrap servers from your AMQ Streams cluster.
3 Enables developers to use the KafkaSource event source type in the cluster.
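
After the channel and source are enabled, developers can create KafkaChannel and KafkaSource objects in their own namespaces. The following objects are illustrative sketches only: the API versions shown here are assumptions that can differ between releases, and names such as example-channel, example-topic, and example-service are placeholders.

Example KafkaChannel object (illustrative)
apiVersion: messaging.knative.dev/v1beta1 # API version may differ in your release
kind: KafkaChannel
metadata:
    name: example-channel # placeholder name
    namespace: default
spec:
    numPartitions: 3 # number of partitions for the backing Kafka topic
    replicationFactor: 1 # replication factor for the backing Kafka topic

Example KafkaSource object (illustrative)
apiVersion: sources.knative.dev/v1beta1 # API version may differ in your release
kind: KafkaSource
metadata:
    name: example-source # placeholder name
    namespace: default
spec:
    consumerGroup: example-group # Kafka consumer group used by the source
    bootstrapServers:
        - <bootstrap_server> # same AMQ Streams bootstrap servers as above
    topics:
        - example-topic # placeholder topic to consume from
    sink:
        ref:
            apiVersion: serving.knative.dev/v1
            kind: Service
            name: example-service # placeholder Knative service that receives the events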

Installing Apache Kafka components using the web console

Cluster administrators can enable the use of Apache Kafka functionality in an OpenShift Serverless deployment by creating an instance of the KnativeKafka custom resource, which is provided by the Knative Kafka API of the OpenShift Serverless Operator.

Prerequisites
  • You have installed OpenShift Serverless, including Knative Eventing, in your OpenShift Container Platform cluster.

  • You have access to a Red Hat AMQ Streams cluster.

  • You have cluster administrator permissions on OpenShift Container Platform.

  • You are logged in to the web console.

Procedure
  1. In the Administrator perspective, navigate to Operators → Installed Operators.

  2. Check that the Project dropdown at the top of the page is set to Project: knative-eventing.

  3. Click Knative Kafka in the list of Provided APIs for the OpenShift Serverless Operator to go to the Knative Kafka tab.

  4. Click Create Knative Kafka.

  5. Optional: Configure the KnativeKafka object in the Create Knative Kafka page. To do so, use either the default form provided or edit the YAML.

    • Using the form is recommended for simpler configurations that do not require full control of KnativeKafka object creation.

    • Editing the YAML is recommended for more complex configurations that require full control of KnativeKafka object creation. You can access the YAML by clicking the Edit YAML link in the top right of the Create Knative Kafka page.

  6. Click Create after you have completed any of the optional configurations for Kafka. You are automatically directed to the Knative Kafka tab where knative-kafka is in the list of resources.

Verification steps
  1. Click on the knative-kafka resource in the Knative Kafka tab. You are automatically directed to the Knative Kafka Overview page.

  2. View the list of Conditions for the resource and confirm that they have a status of True.
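
You can also confirm the same information on the KnativeKafka resource itself, because the console Conditions list reflects the status.conditions field of the resource. The excerpt below is an illustrative sketch only; the exact condition types reported depend on the release, and Ready is shown here as an assumption.

Example status excerpt from the knative-kafka resource (illustrative)
status:
    conditions:
        - type: Ready # condition type is an assumption; your release may report additional types
          status: "True"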