Send Metrics to a Custom API
Overview
This guide demonstrates how to send EdgeDelta metrics to custom APIs, internal dashboards, or any HTTP endpoint that accepts JSON payloads. This approach enables integration with internal monitoring systems, custom-built dashboards, or third-party services that don’t have native EdgeDelta integrations.
Prerequisites
- Target API Endpoint: An HTTP endpoint that accepts POST requests with JSON payloads
- API Authentication: Credentials (API key, token, or basic auth) if required
- EdgeDelta Pipeline: A metric-producing node (system stats, extract metric, etc.)
Pipeline Flow
A metric source node (for example, system stats or extract metric) feeds a webhook_output node, which posts each metric to the custom API endpoint.
Configuration
Step 1: Identify Your API Requirements
Before configuring the webhook, understand your target API’s requirements (a quick probe sketch follows this list):
- Endpoint URL: The full URL to send metrics
- HTTP Method: Usually POST for metrics ingestion
- Authentication: API key, bearer token, or basic auth
- Payload Format: Required JSON structure
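If parts of the API contract are undocumented, a quick probe can confirm the endpoint, method, and authentication scheme before you build the pipeline. The following is a minimal sketch, assuming a placeholder URL, header name, and API key; substitute your own values and expected status codes.

# Probe a hypothetical metrics endpoint: does it accept an authenticated JSON POST?
# The URL, header name, and key below are placeholders.
import json
import urllib.request, urllib.error

endpoint = "https://api.your-dashboard.com/v1/metrics"   # placeholder endpoint
api_key = "your-api-key"                                 # placeholder credential

req = urllib.request.Request(
    endpoint,
    data=json.dumps({"metric_name": "probe.test", "value": 0}).encode(),
    headers={"Content-Type": "application/json", "X-API-Key": api_key},
    method="POST",
)
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        print("Status:", resp.status)        # 2xx suggests the endpoint and auth are correct
except urllib.error.HTTPError as err:
    print("HTTP error:", err.code)           # 401/403: check credentials; 400: check payload shape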
Step 2: Store Credentials
Store API credentials as environment variables:
export DASHBOARD_API_KEY="your-api-key"
export DASHBOARD_ENDPOINT="https://api.your-dashboard.com/v1/metrics"
Step 3: Configure the Webhook Node
Create a webhook node with a payload matching your API requirements:
- name: metrics_dashboard
  type: webhook_output
  endpoint: https://api.custom-dashboard.com/v1/metrics
  headers:
    - header: Content-Type
      value: "application/json"
    - header: X-API-Key
      value: "${DASHBOARD_API_KEY}"
  payload: |
    {
      "metric_name": "{{ .item.name }}",
      "type": "{{ .item._stat_type }}",
      {{ if eq .item._stat_type "value" }}"value": {{ .item.gauge.value }},{{ end }}
      {{ if eq .item._stat_type "sum" }}"value": {{ .item.sum.value.sum }},{{ end }}
      {{ if eq .item._stat_type "histogram" }}"histogram": {
        "sum": {{ .item.histogram.sum }},
        "count": {{ .item.histogram.count }},
        "min": {{ .item.histogram.min }},
        "max": {{ .item.histogram.max }}
      },{{ end }}
      "timestamp": "{{ .item.timestamp }}",
      "tags": {
        "host": "{{ index .item.resource \"host.name\" }}",
        "source": "{{ index .item.resource \"ed.source_name\" }}",
        "source_type": "{{ index .item.resource \"ed.source_type\" }}",
        "environment": "{{ index .item.attributes \"environment\" | default \"production\" }}"
      },
      "metadata": {
        "stat_type": "{{ .item._stat_type }}",
        "agent_tag": "{{ index .item.resource \"ed.tag\" }}"
      }
    }
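For reference, rendering this template against a hypothetical gauge metric (system.cpu.usage from host web-01) would produce JSON along the lines of the snippet below. The values are invented; the check simply confirms that the target shape parses as JSON.

# Illustrative rendered payload for a hypothetical gauge metric;
# the values are made up, the structure mirrors the template above.
import json

rendered = '''
{
  "metric_name": "system.cpu.usage",
  "type": "value",
  "value": 42.5,
  "timestamp": "1769468902176",
  "tags": {
    "host": "web-01",
    "source": "system_metrics",
    "source_type": "system_stats",
    "environment": "production"
  },
  "metadata": {
    "stat_type": "value",
    "agent_tag": "prod-agents"
  }
}
'''

payload = json.loads(rendered)   # raises ValueError if the JSON were malformed
print(payload["metric_name"], payload["value"])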
Step 4: Connect to a Metric Source
nodes:
  - name: system_metrics
    type: ed_system_stats_input

  - name: metrics_dashboard
    type: webhook_output
    # ... configuration from above ...

links:
  - from: system_metrics
    to: metrics_dashboard
Metric Data Reference
Available Metric Fields
| Field | Description | Example |
|---|---|---|
| .item.name | Metric name | system.cpu.usage |
| .item._stat_type | Internal stat type | value, sum, histogram |
| .item.kind | Metric kind | gauge, sum, histogram |
| .item.unit | Metric unit | 1, By, s |
| .item.description | Metric description | CPU usage percentage |
| .item.timestamp | Unix timestamp (ms) | 1769468902176 |
| .item.resource | Resource attributes | Host, namespace, etc. |
| .item.attributes | Metric attributes | Custom labels |
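To make these fields concrete, the structure a payload template operates on looks roughly like the dictionary below for a hypothetical gauge metric; the values and the exact set of resource keys are illustrative and vary by source. Because resource keys contain dots, the templates use the index function rather than dot access.

# Rough shape of a single metric item as seen by the payload template.
# Values are illustrative; real resource/attribute keys depend on the source.
item = {
    "name": "system.cpu.usage",
    "_stat_type": "value",
    "kind": "gauge",
    "unit": "1",
    "description": "CPU usage percentage",
    "timestamp": 1769468902176,            # Unix timestamp in milliseconds
    "gauge": {"value": 42.5},              # present for gauge metrics
    "resource": {
        "host.name": "web-01",             # dotted keys require {{ index ... }} in templates
        "ed.source_name": "system_metrics",
        "ed.source_type": "system_stats",
        "ed.tag": "prod-agents",
    },
    "attributes": {"environment": "production"},
}

print(item["resource"]["host.name"])       # equivalent of {{ index .item.resource "host.name" }}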
Value Access by Metric Type
Gauge Metrics (_stat_type: "value"):
{{ .item.gauge.value }}
Sum Metrics (_stat_type: "sum"):
{{ .item.sum.value.sum }}
{{ .item.sum.is_monotonic }}
Histogram Metrics (_stat_type: "histogram"):
{{ .item.histogram.sum }}
{{ .item.histogram.count }}
{{ .item.histogram.min }}
{{ .item.histogram.max }}
{{ .item.histogram.counts }} // Bucket counts array
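The same branching appears in every payload template below; expressed as a small Python helper against the illustrative item shape above, the logic is:

# Pick the value(s) to report based on the metric's _stat_type,
# mirroring the template conditionals used throughout this guide.
def extract_value(item: dict):
    stat_type = item.get("_stat_type")
    if stat_type == "value":          # gauge metrics
        return item["gauge"]["value"]
    if stat_type == "sum":            # sum/counter metrics
        return item["sum"]["value"]["sum"]
    if stat_type == "histogram":      # histograms carry several aggregates
        h = item["histogram"]
        return {"sum": h["sum"], "count": h["count"], "min": h["min"], "max": h["max"]}
    return None                       # unknown type: nothing to report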
Common API Payload Formats
Datadog-Style Format
Datadog-style series APIs expect timestamps in seconds, so the millisecond timestamp is divided by 1000:
payload: |
  {
    "series": [
      {
        "metric": "{{ .item.name }}",
        "points": [[{{ div .item.timestamp 1000 }}, {{ if eq .item._stat_type "value" }}{{ .item.gauge.value }}{{ else }}0{{ end }}]],
        "type": "gauge",
        "host": "{{ index .item.resource \"host.name\" }}",
        "tags": [
          "source:{{ index .item.resource \"ed.source_name\" }}",
          "env:{{ index .item.attributes \"environment\" | default \"production\" }}"
        ]
      }
    ]
  }
InfluxDB Line Protocol (JSON Wrapper)
payload: |
  {
    "measurement": "{{ .item.name }}",
    "tags": {
      "host": "{{ index .item.resource \"host.name\" }}",
      "source": "{{ index .item.resource \"ed.source_name\" }}"
    },
    "fields": {
      "value": {{ if eq .item._stat_type "value" }}{{ .item.gauge.value }}{{ else if eq .item._stat_type "sum" }}{{ .item.sum.value.sum }}{{ else }}0{{ end }}
    },
    "time": {{ .item.timestamp }}
  }
Prometheus Remote Write (Simplified)
Note that the actual Prometheus remote write protocol uses snappy-compressed protobuf; the JSON below is a simplified shape for endpoints that accept a JSON variant.
payload: |
  {
    "timeseries": [
      {
        "labels": {
          "__name__": "{{ .item.name | replace \".\" \"_\" }}",
          "host": "{{ index .item.resource \"host.name\" }}",
          "source": "{{ index .item.resource \"ed.source_name\" }}",
          "job": "edgedelta"
        },
        "samples": [
          {
            "value": {{ if eq .item._stat_type "value" }}{{ .item.gauge.value }}{{ else if eq .item._stat_type "sum" }}{{ .item.sum.value.sum }}{{ else }}0{{ end }},
            "timestamp": {{ .item.timestamp }}
          }
        ]
      }
    ]
  }
Splunk HEC Format
payload: |
  {
    "event": "metric",
    "source": "edgedelta",
    "sourcetype": "_json",
    "host": "{{ index .item.resource \"host.name\" }}",
    "time": {{ div .item.timestamp 1000 }},
    "fields": {
      "metric_name:{{ .item.name }}": {{ if eq .item._stat_type "value" }}{{ .item.gauge.value }}{{ else if eq .item._stat_type "sum" }}{{ .item.sum.value.sum }}{{ else }}0{{ end }},
      "source_name": "{{ index .item.resource \"ed.source_name\" }}",
      "agent_tag": "{{ index .item.resource \"ed.tag\" }}"
    }
  }
Authentication Methods
API Key Header
headers:
  - header: X-API-Key
    value: "${API_KEY}"
Bearer Token
headers:
  - header: Authorization
    value: "Bearer ${ACCESS_TOKEN}"
Basic Authentication
headers:
  - header: Authorization
    value: "Basic ${BASE64_CREDENTIALS}"
Where BASE64_CREDENTIALS is base64(username:password).
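A minimal way to generate that value locally; the username and password below are placeholders.

# Produce the Base64 credentials expected by Basic authentication.
import base64

username = "metrics-writer"   # placeholder
password = "s3cr3t"           # placeholder
credentials = base64.b64encode(f"{username}:{password}".encode()).decode()
print(credentials)            # export BASE64_CREDENTIALS=<this value>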
Batching Considerations
The webhook destination sends each metric as a separate HTTP request. For high-volume metrics, consider:
- Aggregation: Use EdgeDelta’s aggregation processors to reduce metric cardinality
- Sampling: Configure metric sampling to reduce volume
- Filtering: Only send critical metrics to the webhook
Error Handling
Retry Configuration
The webhook destination retries failed requests automatically. Consider your API’s rate limits when configuring suppression windows.
Handling API Errors
Monitor EdgeDelta agent logs for webhook errors:
# Check for webhook-related errors
grep -i "webhook" /var/log/edgedelta/agent.log
Testing
Local Mock Server
Test your payload with a local HTTP server:
# Python simple server that logs requests
python3 -c "
from http.server import HTTPServer, BaseHTTPRequestHandler
import json

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        content_length = int(self.headers['Content-Length'])
        body = self.rfile.read(content_length)
        print(json.dumps(json.loads(body), indent=2))
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'{\"status\": \"ok\"}')

HTTPServer(('localhost', 8080), Handler).serve_forever()
"
Validation Checklist
Before production deployment, verify the following (a quick payload check follows this list):
- Payload JSON is valid
- All template variables resolve correctly
- Authentication headers are correct
- Target API accepts the payload format
- Rate limits are understood and accounted for
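One way to exercise the first two items is to save a payload captured by the mock server and run a quick check on it. The file path and required-key list below are placeholders to adapt; note that Go-style templates typically render missing fields as the literal text <no value>, which is worth scanning for.

# Sanity-check a captured payload: valid JSON, required keys present,
# and no unresolved template variables. Path and key list are placeholders.
import json
import sys

required_keys = {"metric_name", "type", "timestamp", "tags"}

raw = open("captured_payload.json").read()
if "<no value>" in raw:
    sys.exit("A template variable did not resolve (found '<no value>')")

payload = json.loads(raw)                 # raises ValueError on invalid JSON
missing = required_keys - payload.keys()
if missing:
    sys.exit(f"Missing keys: {sorted(missing)}")

print("Payload looks valid")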
Troubleshooting
| Issue | Solution |
|---|---|
| 400 Bad Request | Validate JSON syntax and required fields |
| 401 Unauthorized | Check authentication headers and credentials |
| 413 Payload Too Large | Reduce payload size or send smaller batches |
| 429 Too Many Requests | Add a suppression window or reduce metric volume |
| Missing data in payload | Verify template variable names match the actual metric structure |
Best Practices
- Start with a logging mock server: Test payloads locally before deploying to production
- Use environment variables: Never hardcode credentials in configurations
- Handle all metric types: Use conditionals for gauge, sum, and histogram
- Include metadata: Add source and timestamp for debugging
- Monitor delivery: Check agent logs for failed webhook deliveries
- Consider rate limits: Use appropriate suppression windows for high-volume metrics
See Also
- Webhook Destination - Full webhook reference
- Send Metrics to a Webhook - Extract metrics from logs
- Extract Metric Processor - Create metrics from logs