Azure Storage Blob Data Lake Source

Provided by: "Apache Software Foundation"

Support Level for this Kamelet is: "Stable"

Consume files from Azure Storage Blob Data Lake.

Configuration Options

The following table summarizes the configuration options available for the azure-storage-datalake-source Kamelet:

Property | Name | Description | Type | Default | Example
---------|------|-------------|------|---------|--------
accountName | Account Name | Required. The Azure Storage Blob Data Lake account name. | string | |
clientId | Client Id | Required. The Azure Storage Blob Data Lake client ID. | string | |
clientSecret | Client Secret | Required. The Azure Storage Blob Data Lake client secret. | string | |
fileSystemName | File System Name | Required. The Azure Storage Blob Data Lake file system name. | string | |
tenantId | Tenant Id | Required. The Azure Storage Blob Data Lake tenant ID. | string | |
credentialType | Credential Type | Determines the credential strategy to adopt. Enum values: CLIENT_SECRET, SHARED_KEY_CREDENTIAL, AZURE_IDENTITY, AZURE_SAS, SERVICE_CLIENT_INSTANCE | string | CLIENT_SECRET |
delay | Delay | The number of milliseconds before the next poll of the selected blob. | integer | 500 |
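
As a quick illustration of how these options are supplied, the following sketch references the Kamelet directly in a Camel YAML route and sets the required properties as endpoint parameters. All credential and resource values below are placeholders, and credentialType and delay simply restate their defaults:

- route:
    from:
      uri: "kamelet:azure-storage-datalake-source"
      parameters:
        accountName: "my-account"          # placeholder, not a real account
        clientId: "my-client-id"           # placeholder
        clientSecret: "my-client-secret"   # placeholder
        fileSystemName: "my-file-system"   # placeholder
        tenantId: "my-tenant-id"           # placeholder
        credentialType: "CLIENT_SECRET"    # optional, default value
        delay: 500                         # optional, default value (milliseconds)
      steps:
        - to:
            uri: "kamelet:log-sink"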

Dependencies

At runtime, the azure-storage-datalake-source Kamelet relies upon the presence of the following dependencies:

  • camel:azure-storage-datalake

  • camel:kamelet

  • camel:core

  • camel:timer

  • mvn:org.apache.camel.kamelets:camel-kamelets-utils:4.6.0-SNAPSHOT

Camel JBang usage

Prerequisites

  • You’ve installed JBang.

  • You have executed the following command:

jbang app install camel@apache/camel

Supposing you have a file named route.yaml with this content:

- route:
    from:
      uri: "kamelet:timer-source"
      parameters:
        period: 10000
        message: 'test'
      steps:
        - to:
            uri: "kamelet:log-sink"

You can now run it directly through the following command:

camel run route.yaml
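
The same approach applies to the azure-storage-datalake-source itself. To avoid hard-coding secrets in the route file, a sketch like the following uses Camel property placeholders; the property keys (datalake.*) are arbitrary names chosen for this example, and the values in application.properties are placeholders:

route.yaml
- route:
    from:
      uri: "kamelet:azure-storage-datalake-source"
      parameters:
        accountName: "{{datalake.accountName}}"
        clientId: "{{datalake.clientId}}"
        clientSecret: "{{datalake.clientSecret}}"
        fileSystemName: "{{datalake.fileSystemName}}"
        tenantId: "{{datalake.tenantId}}"
      steps:
        - to:
            uri: "kamelet:log-sink"

application.properties
datalake.accountName=my-account
datalake.clientId=my-client-id
datalake.clientSecret=my-client-secret
datalake.fileSystemName=my-file-system
datalake.tenantId=my-tenant-id

Run it the same way with camel run route.yaml; Camel JBang should pick up application.properties from the working directory and resolve the placeholders.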

Camel K Environment Usage

This section describes how you can use the azure-storage-datalake-source.

Knative source

You can use the azure-storage-datalake-source Kamelet as a Knative source by binding it to a Knative object.

azure-storage-datalake-source-pipe.yaml
apiVersion: camel.apache.org/v1
kind: Pipe
metadata:
  name: azure-storage-datalake-source-pipe
spec:
  source:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1
      name: azure-storage-datalake-source
    properties:
      accountName: The Account Name
      clientId: The Client Id
      clientSecret: The Client Secret
      fileSystemName: The File System Name
      tenantId: The Tenant Id
  sink:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel

Prerequisite

You have Camel K installed on the cluster.

Procedure for using the cluster CLI

  1. Save the azure-storage-datalake-source-pipe.yaml file to your local drive, and then edit it as needed for your configuration.

  2. Run the source by using the following command:

    kubectl apply -f azure-storage-datalake-source-pipe.yaml

Procedure for using the Kamel CLI

Configure and run the source by using the following command:

kamel bind channel:mychannel -p "source.accountName=The Account Name" -p "source.clientId=The Client Id" -p "source.clientSecret=The Client Secret" -p "source.fileSystemName=The File System Name" -p "source.tenantId=The Tenant Id" azure-storage-datalake-source

This command creates the Kamelet Pipe in the current namespace on the cluster.

Kafka source

You can use the azure-storage-datalake-source Kamelet as a Kafka source by binding it to a Kafka topic.

azure-storage-datalake-source-pipe.yaml
apiVersion: camel.apache.org/v1
kind: Pipe
metadata:
  name: azure-storage-datalake-source-pipe
spec:
  source:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1
      name: azure-storage-datalake-source
    properties:
      accountName: The Account Name
      clientId: The Client Id
      clientSecret: The Client Secret
      fileSystemName: The File System Name
      tenantId: The Tenant Id
  sink:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic

Prerequisites

  • You’ve installed Strimzi.

  • You’ve created a topic named my-topic in the current namespace.

  • You have Camel K installed on the cluster.

Procedure for using the cluster CLI

  1. Save the azure-storage-datalake-source-pipe.yaml file to your local drive, and then edit it as needed for your configuration.

  2. Run the source by using the following command:

    kubectl apply -f azure-storage-datalake-source-pipe.yaml

Procedure for using the Kamel CLI

Configure and run the source by using the following command:

kamel bind kafka.strimzi.io/v1beta1:KafkaTopic:my-topic -p "source.accountName=The Account Name" -p "source.clientId=The Client Id" -p "source.clientSecret=The Client Secret" -p "source.fileSystemName=The File System Name" -p "source.tenantId=The Tenant Id" azure-storage-datalake-source

This command creates the Kamelet Pipe in the current namespace on the cluster.