Configuration Overview
Design and build effective pipelines by testing.
See Quickstart: Basic Pipeline Configuration.
This section describes the Pipelines Interface.
The configuration structure is a graph consisting of three types of nodes (sources, processors, and destinations), the links between them, and the data handled by the pipeline.
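As a rough illustration, a pipeline configuration can be pictured as a set of named nodes joined by links. The sketch below is illustrative only: the node type names and fields are assumptions, not the exact Edge Delta schema, and the kubernetes_logs and edgedelta node names are borrowed from the example later on this page.

```yaml
# Minimal sketch, assuming a nodes/links layout; type names and fields
# are placeholders for illustration, not the exact Edge Delta schema.
nodes:
  - name: kubernetes_logs      # source node
    type: kubernetes_input     # assumed type name
  - name: mask_card            # processor node (hypothetical)
    type: mask                 # assumed type name
  - name: edgedelta            # destination node
    type: ed_archive_output    # assumed type name

links:
  - from: kubernetes_logs
    to: mask_card
  - from: mask_card
    to: edgedelta
```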
The pipeline builder opens with Edit Mode off. Turn it on to build a pipeline.
You can filter the Sources, Processors, and Destinations by name to view only the connections between the selected objects. Note that this changes the view only, not the pipeline configuration.
For example, you can view the pipeline with only the kubernetes_logs, k8s_traffic_metrics, k8s_trace_input, and k8s_metrics nodes and their connections to only the edgedelta destination:
The pipeline builder lists the outgoing data rate for each node. Using this view you can quickly see how the configuration contributes to the overall pipeline efficiency.
The link weight indicates the relative volume of traffic on the link between two nodes.
For example, the link from the kubernetes_logs node carrying 6GB of traffic is rendered thicker than the 896MB of traffic flowing on the link from the k8s_trace_input node.
Nodes are configuration objects. Each node performs a function, and you design a pipeline by creating a flow of traffic through a set of nodes. Select a node to view its paths, or double-click it to view its details.
You can toggle between viewing traffic metrics as events or bytes, and you can view a traffic graph for the node. Matched events indicates the ratio of input events to successfully output events; for example, a node that takes in 1,000 events and successfully outputs 950 has a matched rate of 95%. Bear in mind that nodes that generate events might have a value of more than 100%. Nodes that fail to process traffic due to a configuration error will have a low matching rate.
The details pane also shows the node's portion of the configuration YAML.
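For example, selecting a source node might show a fragment along these lines; the type name and field are assumptions for illustration, not the exact schema:

```yaml
# Hypothetical node fragment as shown in the details pane
- name: kubernetes_logs
  type: kubernetes_input                 # assumed type name
  exclude:
    - k8s.namespace.name=kube-system    # assumed field, for illustration only
```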
Click the kebab icon to view settings for the pipeline.
Cloud pipelines have a Settings option. See Edit Cloud Pipeline Resources.
The Throughput tab on the Pipeline view shows the traffic statistics for that particular pipeline only. As on the Pipelines overview page, you can view Bytes in and out as well as Events in and out.
You can filter the graphs by service, node type and name, or data type:
If any packs are used in your pipeline, they are listed on the Packs tab. Here you can determine whether they are up to date with the latest pack version.
On the Resource Usage tab you can view graphs for the CPU (agent_cpu_millicores.value and ed.agent.cpu.milicores metrics) and memory usage (agent_mem_alloc.value and ed.agent.memory.allocation metrics) of the Edge Delta agents belonging to this pipeline.
You can filter the graphs by Host:
On the Agents tab, the Edge Delta agents belonging to the pipeline are listed in the Host Name table.
Click Upgrade Agents to view the upgrade commands for the latest agent version, for the environment you selected when you created the pipeline.
Click an agent to view its details for troubleshooting an Edge Delta installation. It opens on the Health tab for the agent:
Here you can examine the health metrics for the agent's internal components.
On the Logs tab you can view internal logs generated by that agent:
The Metrics tab shows resource usage for the agent, similar to viewing pipeline resource usage filtered by Host Name.
On the Profiling tab you can view the agent’s performance profile. Select a dimension to view the flame graph. See Performance Profiling Edge Delta Agents for more information.
Back on the Pipeline view page, the Logs tab shows logs generated by Edge Delta agents. You can filter them by Severity:
How to configure Edge Delta pipelines.
Build and test pipelines using your live data.
How to configure Edge Delta data pipeline processors.
Packs in the Edge Delta Visual Pipeline.
Implement circuit breaker protections for output nodes to prevent cascading failures and protect telemetry delivery pipelines.
Edge Delta Knowledge Libraries.
Global Pipeline configuration options.
Configure Edge Delta Global Data Settings.
Use CEL Custom Macros to reference log fields.