Kafka Sink

Provided by: "Apache Software Foundation"

Support Level for this Kamelet is: "Preview"

Send data to Kafka topics.

The Kamelet recognizes the following message headers, if they are set:

  • key / ce-key: as message key

  • partition-key / ce-partition-key: as message partition key

Both headers are optional.
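For illustration only, the following Camel YAML route sets the key header before sending to the Kamelet, so that value becomes the Kafka message key. This is a sketch, not part of the Kamelet catalog; the file name, endpoint, and property values are placeholders.

kafka-sink-route.yaml
- route:
    from:
      uri: "timer:tick"
      parameters:
        period: "10000"
      steps:
        # the "key" header becomes the Kafka message key
        - setHeader:
            name: "key"
            expression:
              constant: "order-123"
        - setBody:
            expression:
              constant: "Hello Kafka"
        # placeholder connection values; replace with real ones
        - to:
            uri: "kamelet:kafka-sink"
            parameters:
              brokers: "my-broker:9092"
              topic: "my-topic"
              username: "my-user"
              password: "my-password"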

Configuration Options

The following table summarizes the configuration options available for the kafka-sink Kamelet:

Property         | Name              | Description                                                                                           | Type   | Default    | Example
brokers *        | Brokers           | Comma-separated list of Kafka broker URLs                                                             | string |            |
password *       | Password          | Password to authenticate to Kafka                                                                     | string |            |
topic *          | Topic Names       | Comma-separated list of Kafka topic names                                                             | string |            |
username *       | Username          | Username to authenticate to Kafka                                                                     | string |            |
saslMechanism    | SASL Mechanism    | The Simple Authentication and Security Layer (SASL) mechanism used                                    | string | "PLAIN"    |
securityProtocol | Security Protocol | Protocol used to communicate with brokers. SASL_PLAINTEXT, PLAINTEXT, SASL_SSL and SSL are supported. | string | "SASL_SSL" |

Fields marked with (*) are mandatory.
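The optional properties can be set alongside the mandatory ones wherever the Kamelet's properties are configured, for example in the properties block of a KameletBinding sink (shown in full in the Usage section below). The fragment below is a sketch with placeholder values:

  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: kafka-sink
    properties:
      brokers: "my-broker:9092"        # mandatory
      topic: "my-topic"                # mandatory
      username: "my-user"              # mandatory
      password: "my-password"          # mandatory
      saslMechanism: "PLAIN"           # optional, defaults to "PLAIN"
      securityProtocol: "SASL_SSL"     # optional, defaults to "SASL_SSL"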

Usage

This section summarizes how the kafka-sink can be used in various contexts.

Knative Sink

The kafka-sink Kamelet can be used as a Knative sink by binding it to a Knative object.

kafka-sink-binding.yaml
apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: kafka-sink-binding
spec:
  source:
    ref:
      kind: InMemoryChannel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: kafka-sink
    properties:
      brokers: "The Brokers"
      password: "The Password"
      topic: "The Topic Names"
      username: "The Username"

Make sure Camel K is installed in the Kubernetes cluster you are connected to.

Save the kafka-sink-binding.yaml file to your local drive, then edit it according to your needs.

You can run the sink using the following command:

kubectl apply -f kafka-sink-binding.yaml
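
Once the binding has been applied, you can inspect it with a standard kubectl query; the resource name below assumes the metadata used in the example above:

kubectl get kameletbinding kafka-sink-binding -o yaml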