Filter Processors

Filter processors reduce noise and control data volume by removing unnecessary, duplicate, or low-priority events before they continue through the pipeline. They support rule-based filtering, deduplication, and intelligent sampling, enabling teams to reduce costs, lighten downstream workloads, and focus on high-value telemetry.

Edge Delta Deduplicate Logs Processor

The Edge Delta deduplicate logs processor detects repeated log entries and removes the duplicates, so the same event is not forwarded downstream multiple times.
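
To illustrate the general idea behind log deduplication (this is a concept sketch, not Edge Delta's implementation or configuration), the sketch below suppresses logs whose body was already seen within a time window. Keying on the raw log body and the window length are assumptions made for the example.

```python
import time

class LogDeduplicator:
    """Concept sketch: drop logs whose body was already emitted
    within the last `window_seconds`."""

    def __init__(self, window_seconds: float = 60.0):
        self.window_seconds = window_seconds
        self._last_seen: dict[str, float] = {}  # log body -> last emit time

    def process(self, body: str) -> bool:
        """Return True if the log should be forwarded, False if dropped."""
        now = time.monotonic()
        last = self._last_seen.get(body)
        if last is not None and now - last < self.window_seconds:
            return False  # duplicate within the window: drop it
        self._last_seen[body] = now
        return True

dedup = LogDeduplicator(window_seconds=30.0)
print(dedup.process("disk full"))  # True: first occurrence is forwarded
print(dedup.process("disk full"))  # False: duplicate is dropped
```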

Edge Delta Filter Processor

The Edge Delta filter processor evaluates each data item against configured conditions and drops items that should not continue through the pipeline.
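
Conceptually, a filter processor is a predicate applied to each item. The sketch below is illustrative only; the field names (`severity`, `body`) and rule shapes are assumptions, not Edge Delta's configuration schema.

```python
import re

# Illustrative rules: keep items at or above a severity threshold
# whose body does not match a drop pattern.
SEVERITY_ORDER = {"debug": 0, "info": 1, "warn": 2, "error": 3}

def keep(item: dict, min_severity: str = "warn",
         drop_pattern: str = r"health[- ]?check") -> bool:
    """Return True if the item should continue through the pipeline."""
    severity = SEVERITY_ORDER.get(item.get("severity", "info"), 1)
    if severity < SEVERITY_ORDER[min_severity]:
        return False  # below the severity threshold: drop
    if re.search(drop_pattern, item.get("body", ""), re.IGNORECASE):
        return False  # matches the noise pattern: drop
    return True

items = [
    {"severity": "error", "body": "payment failed"},
    {"severity": "info", "body": "GET /healthcheck 200"},
]
print([i for i in items if keep(i)])  # only the error survives
```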

Edge Delta Sample Processor

The Edge Delta sample processor passes through a representative subset of incoming data items, reducing volume while preserving the overall shape of the traffic.
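
One common way to take a representative sample is hash-based (deterministic) sampling, where hashing a key decides keep-or-drop so the decision is stable across restarts and agents. The sketch below illustrates that idea under assumed inputs; it is not Edge Delta's sampling algorithm.

```python
import hashlib

def sampled(key: str, rate: float) -> bool:
    """Deterministically keep roughly `rate` of items, keyed by `key`."""
    digest = hashlib.sha256(key.encode()).digest()
    # Map the first 8 bytes of the hash to [0, 1) and compare to the rate.
    bucket = int.from_bytes(digest[:8], "big") / 2**64
    return bucket < rate

kept = sum(sampled(f"log-{i}", rate=0.1) for i in range(10_000))
print(f"kept {kept} of 10000 (~10% expected)")
```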

Edge Delta Tail Sample Processor

The Edge Delta tail sample processor samples incoming traces based on predefined sampling policies. Because the decision is made at the tail, after a trace's spans have been collected, policies can consider whole-trace properties such as errors or total latency.
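
As a concept sketch of tail-based sampling (the specific policies, field names, and thresholds below are illustrative assumptions, not Edge Delta's policy set), a decision function over a complete trace might look like this:

```python
import random

def tail_sample(spans: list[dict], latency_ms_threshold: float = 500.0,
                base_rate: float = 0.05) -> bool:
    """Decide whether to keep a complete trace (all spans collected).

    Illustrative policies, evaluated in order:
      1. keep any trace containing an error span,
      2. keep any trace slower than the latency threshold,
      3. otherwise keep a small probabilistic share.
    """
    if any(s.get("status") == "error" for s in spans):
        return True
    duration = max(s["end_ms"] for s in spans) - min(s["start_ms"] for s in spans)
    if duration > latency_ms_threshold:
        return True
    return random.random() < base_rate

trace = [
    {"start_ms": 0, "end_ms": 120, "status": "ok"},
    {"start_ms": 10, "end_ms": 700, "status": "ok"},
]
print(tail_sample(trace))  # True: total duration exceeds 500 ms
```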