Edge Delta Deduplicate Logs Processor
The Edge Delta deduplicate logs processor removes duplicate logs.
Filter processors control data volume and relevance by removing unnecessary, duplicate, or low-priority events before they continue through the pipeline. These processors support rule-based filtering, deduplication, and intelligent sampling—enabling teams to reduce costs, streamline downstream workloads, and focus on high-value telemetry.
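To make the deduplication idea concrete, here is a minimal sketch of window-based log deduplication: a log is dropped if an identical body was already seen within a configurable time window. This is an illustrative example only; the window length, the hash-on-body strategy, and the class and parameter names are assumptions for demonstration, not Edge Delta's actual implementation.

```python
import hashlib
import time


class LogDeduplicator:
    """Drop logs whose body was already seen within a time window.

    Illustrative sketch: window_seconds and hashing the raw body are
    assumptions, not Edge Delta's actual processor logic.
    """

    def __init__(self, window_seconds=60):
        self.window_seconds = window_seconds
        self.seen = {}  # body hash -> timestamp of last occurrence

    def process(self, log_body, now=None):
        """Return True if the log should be kept, False if it is a duplicate."""
        now = time.monotonic() if now is None else now
        key = hashlib.sha256(log_body.encode()).hexdigest()
        last = self.seen.get(key)
        self.seen[key] = now
        # Keep the log if it is new or its dedup window has expired.
        return last is None or (now - last) >= self.window_seconds


dedup = LogDeduplicator(window_seconds=60)
events = [(0, "disk full"), (1, "disk full"), (61, "disk full"), (2, "OOM")]
results = [dedup.process(body, now=t) for t, body in events]
print(results)  # [True, False, True, True]
```

The second "disk full" event arrives inside the window and is dropped; the third arrives after the window expires and is kept again, which preserves the signal that the condition persists while suppressing the repetition.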
The Edge Delta deduplicate logs processor removes duplicate logs.
The Edge Delta filter processor filters certain data items from the pipeline.
The Edge Delta sample processor takes a representative sample of incoming data items.
The Edge Delta tail sample processor samples incoming traces and spans based on predefined sampling policies.
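The representative-sampling idea behind the sample processors can be sketched with hash-based sampling: hashing each log body into a bucket keeps a roughly fixed percentage of traffic while being deterministic, so identical items are always kept or always dropped. The function name and percentage parameter below are hypothetical, not part of Edge Delta's configuration.

```python
import hashlib


def keep_sample(log_body: str, percent: float) -> bool:
    """Keep roughly `percent` of distinct logs, deterministically.

    Hypothetical sketch of hash-based sampling, not Edge Delta's
    actual sample processor implementation.
    """
    digest = hashlib.sha256(log_body.encode()).digest()
    # Map the first 4 bytes of the hash to a bucket in [0, 100).
    bucket = int.from_bytes(digest[:4], "big") % 100
    return bucket < percent


# Over many distinct log lines, about 10% fall into the kept buckets.
kept = sum(keep_sample(f"log line {i}", 10) for i in range(10_000))
print(kept)
```

Deterministic hashing is a common design choice here because, unlike random sampling, it gives stable results across pipeline restarts and keeps related duplicates consistent with each other.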