When you create an event source, you can specify a sink to which events from the source are sent. A sink is an addressable or a callable resource that can receive incoming events from other resources. Knative services, channels, and brokers are all examples of sinks.
Addressable objects receive and acknowledge an event delivered over HTTP to an address defined in their status.address.url field. As a special case, the core Kubernetes Service object also fulfills the addressable interface.
Callable objects are able to receive an event delivered over HTTP and transform the event, returning 0 or 1 new events in the HTTP response. These returned events may be further processed in the same way that events from an external event source are processed.
As a developer, you can create an event sink to receive events from a particular source and send them to a Kafka topic.
You have installed the Red Hat OpenShift Serverless operator, with Knative Serving, Knative Eventing, and Knative Kafka APIs, from the Operator Hub.
You have created a Kafka topic in your Kafka environment.
In the Developer perspective, navigate to the +Add view.
Click Event Sink in the Eventing catalog.
Search for KafkaSink in the catalog items and click it.
Click Create Event Sink.
In the form view, type the URL of the bootstrap server, which is a combination of host name and port.
Type the name of the topic to send event data to.
Type the name of the event sink.
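Behind the form, the console creates a KafkaSink custom resource. The following is a minimal sketch of the equivalent YAML; the sink name, bootstrap server address, and topic name are illustrative placeholders, not values from a real cluster:

```yaml
apiVersion: eventing.knative.dev/v1alpha1
kind: KafkaSink
metadata:
  name: my-kafka-sink                             # the event sink name entered in the form
spec:
  bootstrapServers:
    - my-cluster-kafka-bootstrap.kafka:9092       # bootstrap server as <host_name>:<port>
  topic: mytopic                                  # the Kafka topic that receives event data
```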
In the Developer perspective, navigate to the Topology view.
Click the created event sink to view its details in the right panel.
When you create an event source by using the Knative (kn) CLI, you can specify a sink to which events from that resource are sent by using the --sink flag. The sink can be any addressable or callable resource that can receive incoming events from other resources.
The following example creates a sink binding that uses a service, http://event-display.svc.cluster.local, as the sink:
$ kn source binding create bind-heartbeat \
  --namespace sinkbinding-example \
  --subject "Job:batch/v1:app=heartbeat-cron" \
  --sink http://event-display.svc.cluster.local \ (1)
  --ce-override "sink=bound"

(1) The --sink flag specifies the sink to which events from the source are delivered.
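For reference, the command above corresponds roughly to the following SinkBinding resource. This is a sketch mapping each flag to a field, not necessarily byte-for-byte what kn generates:

```yaml
apiVersion: sources.knative.dev/v1
kind: SinkBinding
metadata:
  name: bind-heartbeat
  namespace: sinkbinding-example
spec:
  subject:                          # from --subject "Job:batch/v1:app=heartbeat-cron"
    apiVersion: batch/v1
    kind: Job
    selector:
      matchLabels:
        app: heartbeat-cron
  sink:                             # from --sink
    uri: http://event-display.svc.cluster.local
  ceOverrides:                      # from --ce-override "sink=bound"
    extensions:
      sink: bound
```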
You can configure which CRs can be used with the --sink flag for Knative (kn) CLI commands.
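For example, sink prefixes can be registered in the kn configuration file, typically ~/.config/kn/config.yaml. The mapping below is a sketch: it registers the prefix svc so that --sink svc:<service_name> resolves to a core Kubernetes Service:

```yaml
eventing:
  sink-mappings:
    - prefix: svc        # use as --sink svc:<service_name>
      group: core        # API group of the mapped resource
      version: v1
      resource: services # plural resource name
```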
When you create an event source by using the OpenShift Dedicated web console, you can specify a sink to which events from that source are sent. The sink can be any addressable or callable resource that can receive incoming events from other resources.
The OpenShift Serverless Operator, Knative Serving, and Knative Eventing are installed on your OpenShift Dedicated cluster.
You have logged in to the web console and are in the Developer perspective.
You have created a project or have access to a project with the appropriate roles and permissions to create applications and other workloads in OpenShift Dedicated.
You have created a sink, such as a Knative service, channel, or broker.
Create an event source of any type, by navigating to +Add → Event Source and selecting the event source type that you want to create.
In the Sink section of the Create Event Source form view, select your sink in the Resource list.
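Whichever source type you choose, the sink you select in the Resource list ends up as a sink reference in the source's spec. The following sketch uses a hypothetical PingSource with a Knative service named event-display as the sink; all names and values are illustrative:

```yaml
apiVersion: sources.knative.dev/v1
kind: PingSource
metadata:
  name: test-ping-source            # hypothetical source name
spec:
  schedule: "*/2 * * * *"           # fire every two minutes
  data: '{"message": "Hello world!"}'
  sink:                             # the sink selected in the Resource list
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: event-display
```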
You can verify that the event source was created and is connected to the sink by viewing the Topology page.
In the Developer perspective, navigate to Topology.
View the event source and click the connected sink to see the sink details in the right panel.
You can connect a trigger to a sink, so that events from a broker are filtered before they are sent to the sink. A sink that is connected to a trigger is configured as a subscriber in the Trigger object's resource spec.
Trigger object connected to a Kafka sink
apiVersion: eventing.knative.dev/v1
kind: Trigger
metadata:
  name: <trigger_name> (1)
spec:
...
  subscriber:
    ref:
      apiVersion: eventing.knative.dev/v1alpha1
      kind: KafkaSink
      name: <kafka_sink_name> (2)
(1) The name of the trigger being connected to the sink.
(2) The name of a KafkaSink object.