Edge Delta Kafka Output

Stream data to Kafka.

Overview

The Kafka output streams analytics and insights to your Kafka endpoint.

Example

    - name: kafka
      type: kafka
      endpoint: localhost:2888,localhost:3888 # brokers
      topic: example_kafka_topic
      required_acks: 10
      batch_size: 1000
      batch_bytes: 10000
      batch_timeout: 1m
      async: true
      features: log,metric
      tls:
        disable_verify: true
        ca_file: /var/etc/kafka/ca_file
        ca_path: /var/etc/kafka
        crt_file: /var/etc/kafka/crt_file
        key_file: /var/etc/kafka/keyfile
        key_password: p@ssword123
        client_auth_type: noclientcert 
      sasl:
        username: kafka_username
        password: p@ssword123
        mechanism: PLAIN 

Required Parameters

name

Required

The name parameter specifies a name for the data destination. You refer to this name in other places, for example, to refer to a specific destination in a workflow. Names must be unique within the outputs section. The name is a YAML list element, so it begins with a dash (-) and a space, followed by the string. A name is required for a data destination.

outputs:
  streams:
    - name: <data destination name>

type: kafka

Required

The type parameter specifies the vendor or technology for the streaming data destination. It is a closed list element, so it must be one of the supported options. See the list of supported destination types. A type is required for a streaming data destination.

outputs:
  streams:
    - name: <data destination name>
      type: <destination type>

endpoint

Required

Enter your Kafka broker address. To specify multiple brokers, enter a comma-separated list of addresses, as shown in the example above.

topic

Required

Enter your Kafka topic name.
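For example, a minimal Kafka destination that sets only the required parameters (the broker addresses and topic name below are placeholders):

outputs:
  streams:
    - name: kafka
      type: kafka
      endpoint: localhost:9092,localhost:9093
      topic: example_kafka_topic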

Optional Parameters

async

Optional

Enter true or false to enable or disable asynchronous communication between Edge Delta and Kafka.
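For example, to send messages without waiting for a response from the brokers:

outputs:
  streams:
    - name: kafka
      type: kafka
      endpoint: localhost:9092
      topic: example_kafka_topic
      async: true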

batch_bytes

Optional

Enter a limit (in bytes) for the maximum size of a request before being sent to a partition.

batch_size

Optional

Enter the maximum number of messages to buffer before being sent to a partition.

The default limit is 100 messages.

batch_timeout

Optional

Enter a time limit after which incomplete message batches are flushed to Kafka.
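For example, the following hypothetical batching settings flush a batch when it reaches 1000 messages or 10000 bytes, or after 1 minute, whichever comes first:

outputs:
  streams:
    - name: kafka
      type: kafka
      endpoint: localhost:9092
      topic: example_kafka_topic
      batch_size: 1000
      batch_bytes: 10000
      batch_timeout: 1m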

buffer_max_bytesize

Optional

Enter the maximum size of failed streaming data that you want to retry.

If the failed streaming data is larger than this size, then the failed streaming data will not be retried.

buffer_path

Optional

Enter a folder path to temporarily store failed streaming data.

The failed streaming data will be retried until the data reaches its destinations or until the Buffer TTL value is reached.

If you enter a path that does not exist, then the agent will create directories, as needed.

buffer_ttl

Optional

Enter a length of time to retry failed streaming data.

After this length of time is reached, the failed streaming data will no longer be retried.
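For example, the following buffering settings retry failed data from a local folder for up to 10 minutes (the path and values below are placeholders):

outputs:
  streams:
    - name: kafka
      type: kafka
      endpoint: localhost:9092
      topic: example_kafka_topic
      buffer_path: /var/log/edgedelta/pushbuffer/
      buffer_max_bytesize: 100MB
      buffer_ttl: 10m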

features

Optional

This parameter defines which data types to stream to the destination.
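For example, to stream only logs and metrics, matching the example at the top of this page:

outputs:
  streams:
    - name: kafka
      type: kafka
      endpoint: localhost:9092
      topic: example_kafka_topic
      features: log,metric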

integration_name

Optional

This parameter refers to the organization-level integration created in the Integrations page.

If you need to add multiple instances of the same integration into the config, then you can add a custom name to each instance via the name parameter. In this situation, the name should be used to refer to the specific instance of the destination in the workflows.
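For example, assuming an organization-level Kafka integration named orgs-kafka has been created in the Integrations page (the integration name here is a placeholder):

outputs:
  streams:
    - name: kafka
      type: kafka
      integration_name: orgs-kafka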

required_acks

Optional

Enter the number of acknowledgments that the leader must receive before considering a request to be complete.

To learn more, review this article from Kafka.
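For example, to require acknowledgment from the leader only (the value 1 below is illustrative; in Kafka's producer configuration, 0 disables acknowledgments and -1 waits for all in-sync replicas):

outputs:
  streams:
    - name: kafka
      type: kafka
      endpoint: localhost:9092
      topic: example_kafka_topic
      required_acks: 1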

sasl: mechanism

Optional

Enter a Kafka SASL mechanism type to implement a secure authentication.

You can enter:

  • PLAIN
  • SCRAM-SHA-256
  • SCRAM-SHA-512
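For example, a SASL block using the SCRAM-SHA-256 mechanism (the credentials below are placeholders):

outputs:
  streams:
    - name: kafka
      type: kafka
      endpoint: localhost:9092
      topic: example_kafka_topic
      sasl:
        username: kafka_username
        password: p@ssword123
        mechanism: SCRAM-SHA-256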

sasl: password

Optional

Enter your Kafka SASL password.

sasl: username

Optional

Enter your Kafka SASL username.

tls: ca_file

Optional

Enter the absolute file path to the CA certificate file.

tls: ca_path

Optional

Enter the absolute path of the directory to scan for CA certificate files.

tls: client_auth_type

Optional

Enter a client authentication type.

You can enter:

  • noclientcert
  • requestclientcert
  • requireanyclientcert
  • verifyclientcertifgiven
  • requireandverifyclientcert

The default setting is noclientcert.
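For example, to require clients to present a certificate and verify it:

outputs:
  streams:
    - name: kafka
      type: kafka
      endpoint: localhost:9092
      topic: example_kafka_topic
      tls:
        client_auth_type: requireandverifyclientcert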

tls: crt_file

Optional

Enter the absolute path to the certificate file.

tls: disable_verify

Optional

To disable TLS verification of the certificate, enter disable_verify: true in the YAML file. To enable TLS verification of the certificate, enter disable_verify: false, or remove this line entirely.

tls: key_file

Optional

Enter the absolute path to the private key file.

tls: key_password

Optional

Enter the password for the key file.

tls: max_version

Optional

Enter the maximum version of TLS to accept.

tls: min_version

Optional

Enter the minimum version of TLS to accept.
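For example, a TLS block that pins the accepted protocol range (the file paths are placeholders, and the version string format shown is an assumption):

outputs:
  streams:
    - name: kafka
      type: kafka
      endpoint: localhost:9092
      topic: example_kafka_topic
      tls:
        ca_file: /var/etc/kafka/ca_file
        crt_file: /var/etc/kafka/crt_file
        key_file: /var/etc/kafka/keyfile
        min_version: TLSv1_2
        max_version: TLSv1_3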

Supported Features

See Streaming Features.

Feature Type                 Supported?
Log                          Yes
Metrics                      Yes
Alert as event               No
Alert as log                 Yes
Health                       No
Dimensions as attribute      No
Send as is                   No
Send as JSON                 No
Custom tags                  No
EDAC enrichment              No
Message template             No
outgoing_bytes.sum           Yes
outgoing__raw_bytes.sum      No
outgoing_lines.count         Yes
output buffering to disk     No