Live Capture and In-Stream Debugging

Explore Live Capture and In-Stream Debugging for building and testing pipelines using live data with Edge Delta.

Overview

Effective pipeline design depends on knowing your data. Live Capture makes your actual data visible and lets you preview the expected output of pipeline processors as you build them.

Note: In a cluster with multiple nodes, if you do not have a coordinator pipeline deployed, live capture only shows data from the leader node. With a coordinator pipeline deployed, live capture shows data from all nodes.

Live Capture is designed with enterprise security in mind. Live capture data is handled as follows:

  • Temporary Storage: Data is stored in a secure Redis-based cache with automatic expiration:
    • Payload data (captured items): 10 minutes
    • Task status information: 1 hour
  • Secure Transmission: All data is transmitted over TLS-encrypted HTTPS connections between the agent and Edge Delta backend.
  • Transient by Design: Data is not persisted to permanent storage and is automatically removed when the capture window expires.
  • Data Flow: The agent polls for capture tasks every 5 seconds and uploads captured data every 1 second via encrypted HTTPS POST requests to the Edge Delta backend API.
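
The endpoints involved are internal to the Edge Delta backend, but the cadence itself is straightforward. The following minimal Python sketch is illustrative only, with placeholder functions standing in for the real agent logic (the agent is not implemented this way); it simply shows the 5-second poll and 1-second upload timing described above:

import time

POLL_INTERVAL_S = 5    # the agent checks for new capture tasks every 5 seconds
UPLOAD_INTERVAL_S = 1  # buffered captured items are flushed every second

def poll_for_capture_tasks():
    # Placeholder: in the real agent this is an HTTPS request to the backend API.
    return []

def get_buffered_items():
    # Placeholder: data items captured from the pipeline since the last upload.
    return []

def upload_captured_items(items):
    # Placeholder: in the real agent this is an encrypted HTTPS POST to the backend.
    pass

def capture_loop(run_for_s=30):
    next_poll = next_upload = time.monotonic()
    deadline = time.monotonic() + run_for_s
    while time.monotonic() < deadline:
        now = time.monotonic()
        if now >= next_poll:
            poll_for_capture_tasks()
            next_poll = now + POLL_INTERVAL_S
        if now >= next_upload:
            upload_captured_items(get_buffered_items())
            next_upload = now + UPLOAD_INTERVAL_S
        time.sleep(0.1)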

Disabling Live Capture

For highly security-sensitive environments, you can disable live capture entirely by setting the environment variable:

ED_DISABLE_LIVE_CAPTURE=1

When this variable is set, the agent will not collect or transmit any live capture data to the Edge Delta backend. This is useful in environments where:

  • Regulatory requirements prohibit real-time data sampling
  • Network policies restrict outbound data transmission
  • Security policies mandate strict control over data leaving the environment

Note: Disabling live capture will prevent you from using in-stream debugging features and viewing live data samples in the Edge Delta UI.

Live Tail Samples

To view live tail samples, open a processor node in edit mode:

The left pane shows the last 100 data items detected passing through the node. You configure your processor in the center pane and preview the output in the pane on the right.

You can select a specific data item in the left pane to view its details; the diff for that item is shown in the right pane. Selecting a data item pauses live capture and lists the last 100 data items. Click Play to resume capturing the latest data items.

AI-Powered Recommendations

While working in Live Capture, Edge Delta’s Recommendations feature can suggest optimal processors based on your actual data patterns. The multiprocessor view displays recommended processors in the center pane, helping you quickly identify opportunities to filter, sample, or transform your data for better pipeline efficiency. These contextual recommendations are generated by analyzing your live data stream and identifying patterns that could benefit from optimization.

Optimization Rate

A key metric to note is the percentage increase or decrease in data size that results from the logic in the center pane. This is calculated against the 100 items in the left pane.
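
For illustration, the short sketch below assumes the rate is the percentage change in total byte size between the sampled input items and the corresponding output (the exact formula used by the UI may differ):

def optimization_rate(input_items, output_items):
    # Percentage change in data size; a negative value means the logic reduced volume.
    in_bytes = sum(len(item.encode("utf-8")) for item in input_items)
    out_bytes = sum(len(item.encode("utf-8")) for item in output_items)
    return (out_bytes - in_bytes) / in_bytes * 100

# Example: 100 sampled items totalling 50 KB reduced to 30 KB of output
# gives an optimization rate of -40.0, i.e. a 40% reduction in data size.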

Affected Indicator

Data items that are modified by a processor are marked with a blue indicator in the input and output live capture panes. This is most noticeable when a condition is set in the processor, because only matching items are affected:

Pretty View

You can switch between a code view and a “pretty” view of the data items:

With pretty view enabled, you can select a field to view shortcuts for creating processors that use that field:

Filters

You can filter each live capture pane to show a specific data type. For example, you might want to see logs in the left pane and the resulting metrics in the right.

Note: This changes the view only; it does not apply a filter to the processor.

You can also search the input logs for a keyword. This further filters the input and output panes. For example, of the 100 samples, 23 contain the word error:


Edge Delta Debug Destination Page

View data item samples captured by Debug destination nodes for pipeline troubleshooting.