Azure Storage Blob Data Lake Sink

Provided by: "Apache Software Foundation"

Support Level for this Kamelet is: "Stable"

Send data to Azure Storage Blob Data Lake.

Configuration Options

The following configuration options are available for the azure-storage-datalake-sink Kamelet:

accountName (Account Name)
    Required. The Azure Storage Blob Data Lake account name.
    Type: string

clientId (Client Id)
    Required. The Azure Storage Blob Data Lake client ID.
    Type: string

clientSecret (Client Secret)
    Required. The Azure Storage Blob Data Lake client secret.
    Type: string

fileSystemName (File System Name)
    Required. The Azure Storage Blob Data Lake file system name.
    Type: string

tenantId (Tenant Id)
    Required. The Azure Storage Blob Data Lake tenant ID.
    Type: string

credentialType (Credential Type)
    Determines the credential strategy to adopt.
    Enum values: CLIENT_SECRET, SHARED_KEY_CREDENTIAL, AZURE_IDENTITY, AZURE_SAS, SERVICE_CLIENT_INSTANCE
    Type: string
    Default: CLIENT_SECRET
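
When the Kamelet is invoked directly from a Camel route rather than through a Pipe, the same options can be passed as endpoint parameters. The following is a minimal sketch with placeholder values (my-account, my-client-id, and so on are not real credentials); in practice the secret values would normally come from a property placeholder or secret store rather than being hard-coded. Because credentialType is omitted, the default CLIENT_SECRET strategy applies.

- route:
    from:
      uri: "timer:tick"
      parameters:
        period: 10000
      steps:
        # Set a sample payload to upload, then hand it to the sink Kamelet.
        - setBody:
            constant: "hello from camel"
        - to: "kamelet:azure-storage-datalake-sink?accountName=my-account&fileSystemName=my-filesystem&clientId=my-client-id&clientSecret=my-client-secret&tenantId=my-tenant-id"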

Dependencies

At runtime, the azure-storage-datalake-sink Kamelet relies upon the presence of the following dependencies:

  • camel:azure-storage-datalake

  • camel:kamelet

  • camel:core

  • camel:timer

  • mvn:org.apache.camel.kamelets:camel-kamelets-utils:4.6.0-SNAPSHOT

Camel JBang Usage

Prerequisites

  • You’ve installed JBang.

  • You have executed the following command:

jbang app install camel@apache/camel

Suppose you have a file named route.yaml with the following content:

- route:
    from:
      uri: "kamelet:timer-source"
      parameters:
        period: 10000
        message: 'test'
      steps:
        - to:
            uri: "kamelet:log-sink"

You can now run it directly with the following command:

camel run route.yaml
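
To exercise the azure-storage-datalake-sink Kamelet itself rather than the generic log sink, point the final step of the route at it and supply its required properties. The snippet below is a sketch with placeholder values; run it the same way with camel run route.yaml.

- route:
    from:
      uri: "kamelet:timer-source"
      parameters:
        period: 10000
        message: 'test'
      steps:
        - to:
            uri: "kamelet:azure-storage-datalake-sink"
            parameters:
              # Placeholder values: replace with your own account details.
              accountName: "my-account"
              clientId: "my-client-id"
              clientSecret: "my-client-secret"
              fileSystemName: "my-filesystem"
              tenantId: "my-tenant-id"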

Camel K Environment Usage

This section describes how you can use the azure-storage-datalake-sink.

Knative sink

You can use the azure-storage-datalake-sink Kamelet as a Knative sink by binding it to a Knative object.

azure-storage-datalake-sink-pipe.yaml
apiVersion: camel.apache.org/v1
kind: Pipe
metadata:
  name: azure-storage-datalake-sink-pipe
spec:
  source:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1
      name: azure-storage-datalake-sink
    properties:
      accountName: The Account Name
      clientId: The Client Id
      clientSecret: The Client Secret
      fileSystemName: The File System Name
      tenantId: The Tenant Id

Prerequisite

You have Camel K installed on the cluster.

Procedure for using the cluster CLI

  1. Save the azure-storage-datalake-sink-pipe.yaml file to your local drive, and then edit it as needed for your configuration.

  2. Run the sink by using the following command:

    kubectl apply -f azure-storage-datalake-sink-pipe.yaml

Procedure for using the Kamel CLI

Configure and run the sink by using the following command:

kamel bind channel:mychannel -p "sink.accountName=The Account Name" -p "sink.clientId=The Client Id" -p "sink.clientSecret=The Client Secret" -p "sink.fileSystemName=The File System Name" -p "sink.tenantId=The Tenant Id" azure-storage-datalake-sink

This command creates the Kamelet Pipe in the current namespace on the cluster.

Kafka sink

You can use the azure-storage-datalake-sink Kamelet as a Kafka sink by binding it to a Kafka topic.

azure-storage-datalake-sink-pipe.yaml
apiVersion: camel.apache.org/v1
kind: Pipe
metadata:
  name: azure-storage-datalake-sink-pipe
spec:
  source:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1
      name: azure-storage-datalake-sink
    properties:
      accountName: The Account Name
      clientId: The Client Id
      clientSecret: The Client Secret
      fileSystemName: The File System Name
      tenantId: The Tenant Id

Prerequisites

  • You’ve installed Strimzi.

  • You’ve created a topic named my-topic in the current namespace.

  • You have Camel K installed on the cluster.

Procedure for using the cluster CLI

  1. Save the azure-storage-datalake-sink-pipe.yaml file to your local drive, and then edit it as needed for your configuration.

  2. Run the sink by using the following command:

    kubectl apply -f azure-storage-datalake-sink-pipe.yaml

Procedure for using the Kamel CLI

Configure and run the sink by using the following command:

kamel bind kafka.strimzi.io/v1beta1:KafkaTopic:my-topic -p "sink.accountName=The Account Name" -p "sink.clientId=The Client Id" -p "sink.clientSecret=The Client Secret" -p "sink.fileSystemName=The File System Name" -p "sink.tenantId=The Tenant Id" azure-storage-datalake-sink

This command creates the Kamelet Pipe in the current namespace on the cluster.