Edge Delta Kafka Destination
Send logs to Kafka.
Overview
The Kafka destination node delivers data to Kafka topics. It supports multiple Kafka brokers, flexible batch processing, and secure communications.
Incoming data types: cluster_pattern_and_sample, log, metric, signal, custom
See Send Data to Kafka for more information.
This node requires Edge Delta agent version v0.1.96 or higher.
Example Configuration
- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
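For reference, here is a fuller sketch that combines several of the optional parameters documented below. All values are illustrative, and each parameter is explained in its own section.

- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
  required_acks: 1
  batch_size: 100
  batch_bytes: 1048576
  batch_timeout: "5s"
  tls:
    ca_file: "/etc/edgedelta/ca.pem"
  sasl:
    username: "edgedelta-user"
    password: '{{ SECRET kafka-password }}'   # hypothetical secret name; see the sasl section
    mechanism: "scram-sha-512"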
Required Parameters
name
A descriptive name for the node. This name appears in the pipeline builder, and it is how you reference the node in the YAML. It must be unique across all nodes. Because each node is a YAML list element, the name begins with a - and a space followed by the string. It is a required parameter for all nodes.
nodes:
  - name: <node name>
    type: <node type>
type: kafka_output
The type parameter specifies the type of node being configured. It is specified as a string from a closed list of node types. It is a required parameter.
nodes:
  - name: <node name>
    type: <node type>
endpoint
The endpoint parameter specifies the Kafka broker endpoints where data will be sent. Multiple brokers can be provided as a comma-separated list. It is a required parameter.
- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
topic
The topic parameter designates the Kafka topic to which the data will be sent. It is a required parameter.
- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
Optional Parameters
required_acks
The required_acks parameter determines how many acknowledgements the leader broker must receive before a record batch is considered sent. This controls the durability of records that are sent. It is an optional parameter.
- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
  required_acks: 1
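The accepted values are not enumerated here, but in standard Kafka producer semantics 0 waits for no acknowledgment, 1 waits for the leader broker only, and -1 (often written as all) waits for all in-sync replicas. Assuming this node follows those semantics, the most durable setting would look like the sketch below; verify the accepted values for your agent version.

- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
  required_acks: -1   # wait for all in-sync replicas (assumed standard Kafka semantics)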
batch_size
The batch_size parameter controls the maximum number of messages to batch before sending to the topic. Batching reduces per-request overhead on both the client and the server. It is an optional parameter.
- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
  batch_size: 100
batch_bytes
The batch_bytes parameter sets a maximum size limit, in bytes, for message batches. Like batch_size, it reduces per-request overhead on both the client and the server. It is an optional parameter.
- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
  batch_bytes: 1048576
batch_timeout
The batch_timeout parameter allows you to specify how often incomplete message batches are flushed to Kafka. It is an optional parameter.
- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
  batch_timeout: "5s"
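Taken together, the three batching parameters bound a batch in three dimensions. In typical producer implementations a batch is flushed as soon as it reaches batch_size messages or batch_bytes bytes, or once batch_timeout elapses, whichever comes first; that flush behavior is an assumption here, so confirm it for your agent version. A combined sketch with illustrative values:

- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
  batch_size: 100        # flush after 100 messages...
  batch_bytes: 1048576   # ...or after 1 MiB of data...
  batch_timeout: "5s"    # ...or after 5 seconds, whichever comes first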
async
The async parameter toggles between synchronous and asynchronous sending. You can specify true or false. If set to true, communication between Edge Delta agents and Kafka is asynchronous: writes return without waiting for broker acknowledgment, which typically improves throughput at the cost of weaker delivery feedback. It is an optional parameter.
- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
  async: true
max_message_size
The max_message_size parameter specifies the maximum size of a message that can be sent. It is an optional parameter.
- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
  max_message_size: "2048"
tls
The tls configuration block enables secure communication with Kafka brokers through SSL/TLS settings. The following options can be set within this block:
- ignore_certificate_check: Disables SSL/TLS certificate verification for secure connections. Use with caution.
- ca_file: The absolute file path to the CA certificate for SSL/TLS connections.
- ca_path: The absolute path to a directory containing CA certificate files for SSL/TLS.
- crt_file: The absolute path to the SSL/TLS certificate file for secure communication.
- key_file: The absolute path to the private key file used in SSL/TLS connections.
- key_password: Optional password for the key file.
- client_auth_type: The client authentication type required by the Kafka brokers: noclientcert (default), requestclientcert, requireanyclientcert, verifyclientcertifgiven, or requireandverifyclientcert.
- min_version: The minimum acceptable TLS protocol version: TLSv1_0, TLSv1_1, TLSv1_2 (default), or TLSv1_3.
- max_version: The maximum acceptable TLS protocol version: TLSv1_0, TLSv1_1, TLSv1_2, or TLSv1_3.
- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
  tls:
    ignore_certificate_check: false
    ca_file: "/etc/edgedelta/ca.pem"
    crt_file: "/etc/edgedelta/cert.pem"
    key_file: "/etc/edgedelta/key.pem"
    client_auth_type: "requestclientcert"
    min_version: "TLSv1_1"
    max_version: "TLSv1_3"
sasl
The sasl block defines the authentication details necessary for SASL authentication with Kafka brokers. You can configure the following options:
- username: The username for SASL authentication.
- password: The password for SASL authentication.
- mechanism: The mechanism to use for SASL authentication. Valid options are plain, scram-sha-256, or scram-sha-512.

Sensitive fields such as password support secret references for secure credential management. Instead of hardcoding sensitive values, you can reference a secret configured in your pipeline.
To use a secret in the GUI:
- Create a secret in your pipeline’s Settings > Secrets section (see Using Secrets)
- In this field, select the secret name from the dropdown list that appears
To use a secret in YAML:
Reference it using the syntax: '{{ SECRET secret-name }}'
Example:
field_name: '{{ SECRET my-credential }}'
Note: The secret reference must be enclosed in single quotes when using YAML. Secret values are encrypted at rest and resolved at runtime, ensuring no plaintext credentials appear in logs or API responses.
- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
  sasl:
    username: "edgedelta-user"
    password: "edgedelta-pass"
    mechanism: "scram-sha-512"
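To avoid a plaintext password in the configuration, the same block can use the secret reference syntax described above. The secret name kafka-password is hypothetical; substitute one created in your pipeline's Settings > Secrets.

- name: my_kafka_output
  type: kafka_output
  endpoint: localhost:2888,localhost:3888
  topic: example_kafka_topic
  sasl:
    username: "edgedelta-user"
    password: '{{ SECRET kafka-password }}'   # hypothetical secret name; must be single-quoted
    mechanism: "scram-sha-512"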