Processors Overview
Edge Delta processors are modular components that operate on streaming telemetry—logs, metrics, traces, and events—as it flows through your observability pipeline. Each processor performs a focused function such as parsing, filtering, enrichment, metric extraction, or transformation. By combining processors, you can create efficient, cost-aware pipelines tailored to your team’s monitoring, compliance, and performance goals.
Processors are grouped by function to help you design cleaner, more purposeful pipelines:
- Parse Processors: Normalize unstructured fields like logs or payloads into structured data.
- Transform Processors: Modify, enrich, or shape structured telemetry for routing, storage, or analysis.
- Filter Processors: Reduce noise and control telemetry volume using filters, deduplication, or sampling.
- Utility Processors: Add comments, use conditional logic, or run custom OTTL expressions for advanced workflows.
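To illustrate how these categories combine, the following sketch shows a parse, filter, and transform step stacked in sequence. The field names (`type: sequence`, `parse_json`, `filter`, `ottl_transform`) and the condition syntax are illustrative assumptions, not the exact Edge Delta schema; see Configure a Processor for the actual syntax.

```yaml
# Illustrative only: demonstrates stacking processors in one sequence.
# Field names and options are hypothetical, not the exact Edge Delta schema.
- name: app_log_sequence
  type: sequence
  processors:
    - type: parse_json        # Parse: turn raw log lines into structured fields
      field: body
    - type: filter            # Filter: drop health-check noise to reduce volume
      condition: attributes["path"] != "/healthz"
    - type: ottl_transform    # Transform: enrich items with a static attribute
      statements:
        - set(attributes["env"], "production")
```

Because each step operates on the output of the previous one, ordering matters: parsing first makes the structured fields available to the filter and transform steps that follow.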
If you’re new to pipelines, start with the Pipeline Quickstart Overview or learn how to Configure a Pipeline. To configure an individual processor, see Configure a Processor.
To see how processors interact with sources and destinations, visit the Pipeline Overview, or explore Extract and Aggregate Metrics for a worked example.
For optimization strategies, see Best Practices for Edge Delta Processors.
Legacy processor nodes are being deprecated in favor of the stacked sequence processors described here. See the legacy processors page for details.