Ingest Audit Logs from Edge Delta Platform

Configure an Edge Delta pipeline to ingest audit logs from the Edge Delta platform via the Search API and forward them to SIEM and security analytics platforms.

Overview

Edge Delta audit logs capture user activity within the Edge Delta platform, providing visibility for compliance, security monitoring, and troubleshooting. While these logs are available through the Edge Delta web interface, you can also ingest them into Edge Delta pipelines to:

  • Forward to SIEM platforms for centralized security monitoring
  • Route to multiple destinations such as data lakes, compliance systems, and security analytics tools
  • Transform and enrich audit data before forwarding
  • Correlate audit logs with other telemetry data
  • Apply custom processing and filtering rules

This integration uses the Edge Delta Search API to pull audit logs into a pipeline, where they can be processed and routed like any other telemetry data.

Understanding the Pipeline Processing

The pipeline processes audit logs through several stages:

1. API Response Structure

The Edge Delta Search API returns audit logs in this format:

{
  "items": [
    {
      "body": "Audit log message",
      "severity_text": "INFO",
      "timestamp": "2024-01-15T10:30:00Z",
      "attributes": {
        "user": "user@example.com",
        "action": "login",
        "path": "/api/v1/resource"
      }
    }
  ]
}

2. Processing Steps

| Step | Processor | Purpose |
| --- | --- | --- |
| 1 | JSON Unroll | Extracts individual log entries from the items array |
| 2 | Parse JSON | Parses each log entry and merges fields into attributes |
| 3 | Copy Fields | Promotes body, severity_text, and timestamp to log record fields |
| 4 | Delete Fields | Removes redundant fields and HTTP response headers |

3. Final Log Format

After processing, each audit log entry becomes a structured log with:

  • Body: The audit log message
  • Severity: Log level (INFO, WARN, ERROR, etc.)
  • Timestamp: When the action occurred
  • Attributes: All metadata fields (user, action, path, method, etc.)
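The processor chain above can be sketched as a plain-Python simulation. This is illustrative only, not Edge Delta code; the handling of the nested metadata map is simplified so the result matches the final log format described above:

```python
import json

# Sample Search API response, mirroring the format shown earlier
api_response = json.dumps({
    "items": [
        {
            "body": "Audit log message",
            "severity_text": "INFO",
            "timestamp": "2024-01-15T10:30:00Z",
            "attributes": {"user": "user@example.com", "action": "login"},
        }
    ]
})

def process(raw_body):
    """Simulate the four processing steps on one API response body."""
    records = []
    # Step 1: JSON Unroll - one record per entry in the "items" array
    for item in json.loads(raw_body)["items"]:
        # Step 2: Parse JSON - merge the entry's fields (and its metadata
        # map, simplified here by flattening) into the record attributes
        attributes = {
            **item.get("attributes", {}),
            **{k: item[k] for k in ("body", "severity_text", "timestamp") if k in item},
        }
        # Step 3: Copy Fields - promote key fields to the log record level
        record = {
            "body": attributes.get("body"),
            "severity_text": attributes.get("severity_text"),
            "timestamp": attributes.get("timestamp"),
            "attributes": attributes,
        }
        # Step 4: Delete Fields - drop the now-redundant copies
        for key in ("body", "severity_text", "timestamp", "http.response.header"):
            attributes.pop(key, None)
        records.append(record)
    return records

logs = process(api_response)
```

After processing, each record carries the message in `body`, the level in `severity_text`, and only the metadata fields (user, action, etc.) left in `attributes`.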

Prerequisites

Before configuring audit log ingestion, ensure you have:

  1. Audit logging enabled in your Edge Delta organization (see Audit Logs)

Step 1: Create an API Token

You need an API token with specific permissions to access audit logs via the Search API.

Create the Token

  1. Click Admin and select the My Organization tab.
  2. Click the API Tokens tab and click Create Token.
  3. Enter a descriptive token name (e.g., audit-log-ingestion).

Configure Permissions

Add the following permission to the token:

| Field | Value |
| --- | --- |
| Resource Type | Search |
| Access Type | Read |
| Resource Instance | All current and future search resources |
  4. Copy the access token and store it securely. You will need this for the pipeline configuration.

For more details on API token management, see Manage API Tokens.

Step 2: Get Your Organization ID

  1. In the Edge Delta Admin console, click Admin and select the My Organization tab.
  2. Copy your Organization ID from the page.

Step 3: Configure the Pipeline

Configure an Edge Delta pipeline to ingest audit logs using the HTTP Pull source. You can create the pipeline through the Edge Delta GUI or using YAML configuration.

Step 3.1: Add the HTTP Pull Source

In the Edge Delta pipeline editor, add an HTTP Pull source node with the following configuration:

- name: audit_logs_source
  type: http_pull_input
  user_description: Edge Delta Platform Audit Logs
  endpoint: https://api.edgedelta.com/v1/orgs/<YOUR_ORG_ID>/logs/log_search/search
  headers:
  - header: Accept
    value: application/json
  - header: X-ED-API-Token
    value: <YOUR_API_TOKEN>
  method: GET
  parameters:
  - name: scope
    value: audit
  - name: lookback
    value: 2m
  pull_interval: 2m0s

Configuration Parameters:

| Field | Value | Description |
| --- | --- | --- |
| endpoint | https://api.edgedelta.com/v1/orgs/<YOUR_ORG_ID>/logs/log_search/search | Replace <YOUR_ORG_ID> with your Organization ID |
| X-ED-API-Token | <YOUR_API_TOKEN> | Replace with your API token from Step 1 |
| scope | audit | Filters for audit logs only |
| lookback | 2m | Retrieves logs from the last 2 minutes |
| pull_interval | 2m0s | Pulls data every 2 minutes |

Important: Ensure the lookback duration matches or slightly exceeds the pull_interval to avoid gaps in data collection.
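As an illustration of that check, the Python helper below parses the Go-style duration strings used in the configuration and verifies the lookback covers the pull interval. The `parse_go_duration` function is a hypothetical helper for this sketch, not part of Edge Delta, and handles only the h/m/s units used here:

```python
import re

def parse_go_duration(value):
    """Convert a Go-style duration such as '2m0s' or '1h' into seconds.

    Handles h/m/s components only (the units used in this configuration).
    """
    units = {"h": 3600, "m": 60, "s": 1}
    parts = re.findall(r"(\d+(?:\.\d+)?)([hms])", value)
    if not parts:
        raise ValueError(f"unrecognized duration: {value!r}")
    return sum(float(n) * units[u] for n, u in parts)

lookback = parse_go_duration("2m")         # the lookback parameter
pull_interval = parse_go_duration("2m0s")  # the pull_interval setting
assert lookback >= pull_interval, "lookback must cover the pull interval"
```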

Step 3.2: Add Processors to Transform Audit Logs

After adding the source, connect it to a processor sequence that extracts and transforms the audit log data.

Add the following processors in order:

1. JSON Unroll Processor

Extracts individual audit log entries from the API response array:

- type: json_unroll
  data_types:
  - log
  field_path: body
  new_field_name: items
  json_field_path: items

2. Parse JSON Transform

Parses each log entry and merges fields into attributes:

- type: ottl_transform
  data_types:
  - log
  statements: |-
    set(cache["parsed-json"], ParseJSON(body)["items"])
    merge_maps(attributes, cache["parsed-json"], "upsert") where IsMap(attributes) and IsMap(cache["parsed-json"])
    set(attributes, cache["parsed-json"]) where not (IsMap(attributes) and IsMap(cache["parsed-json"]))    

3. Copy Fields Transform

Promotes important fields to the log record level:

- type: ottl_transform
  data_types:
  - log
  statements: |-
    set(body, attributes["body"])
    set(severity_text, attributes["severity_text"])
    set(timestamp, attributes["timestamp"])    

4. Cleanup Transform

Removes redundant fields:

- type: ottl_transform
  data_types:
  - log
  statements: |-
    delete_key(attributes, "severity_text")
    delete_key(attributes, "body")
    delete_key(attributes, "timestamp")
    delete_key(attributes, "http.response.header")    

Step 3.3: Add Destination(s)

Connect the processor output to one or more destination nodes, such as a SIEM, data lake, or security analytics platform.

Monitoring and Troubleshooting

Verify Data Flow

  1. Check agent logs for HTTP pull activity:

    kubectl logs -n edgedelta <pod-name> | grep "http_pull"
    
  2. Monitor the Edge Delta console for incoming audit logs in your destination.

Common Issues

| Issue | Cause | Solution |
| --- | --- | --- |
| No data flowing | API token invalid or expired | Regenerate token with correct permissions |
| Missing logs | Lookback period too short | Increase lookback to match or exceed pull interval |
| Duplicate logs | Overlapping lookback periods | Ensure lookback matches pull interval exactly |
| Authentication errors | Wrong Organization ID | Verify Organization ID in Admin console |
| Rate limiting | Pull interval too frequent | Increase pull interval (minimum recommended: 1m) |

API Endpoint Details

The audit log ingestion uses the Edge Delta Search API endpoint:

GET https://api.edgedelta.com/v1/orgs/{org_id}/logs/log_search/search

You can test the endpoint manually with any HTTP client before deploying the pipeline.

Query Parameters:

| Parameter | Required | Description |
| --- | --- | --- |
| scope | Yes | Must be set to audit to retrieve audit logs |
| lookback | Yes | Duration in Go format (e.g., 2m, 5m, 1h) |
| limit | No | Maximum number of logs to return (default: 1000) |
| order | No | Sort order: ASC or DESC (default: DESC) |

Headers:

| Header | Required | Description |
| --- | --- | --- |
| X-ED-API-Token | Yes | Your API token with Search read permissions |
| Accept | Yes | Set to application/json |
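As a rough sketch, the request can be assembled with Python's standard library. The org ID and token values are placeholders, and the final urlopen call is left commented out so nothing is actually sent:

```python
import urllib.parse
import urllib.request

ORG_ID = "YOUR_ORG_ID"        # placeholder: your Organization ID from Step 2
API_TOKEN = "YOUR_API_TOKEN"  # placeholder: token with Search read permission

# Query parameters from the table above
params = urllib.parse.urlencode({"scope": "audit", "lookback": "2m"})
url = f"https://api.edgedelta.com/v1/orgs/{ORG_ID}/logs/log_search/search?{params}"

request = urllib.request.Request(
    url,
    headers={"X-ED-API-Token": API_TOKEN, "Accept": "application/json"},
    method="GET",
)

# Uncomment to send the request:
# with urllib.request.urlopen(request) as resp:
#     print(resp.read().decode())
```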

For complete API documentation, see API Reference.

Security Best Practices

  1. Use least privilege: Create dedicated API tokens for audit log ingestion with only the required Search read permissions.

  2. Rotate tokens regularly: Implement a token rotation policy and update pipeline configurations accordingly.

  3. Monitor token usage: Track API token usage in the Edge Delta Admin console to detect anomalies.

  4. Encrypt in transit: The API endpoint uses HTTPS to ensure audit logs are encrypted during transmission.

Next Steps

  • Configure alert rules on audit log patterns to detect security events
  • Set up dashboards to visualize audit activity
  • Integrate with your SIEM platform for correlation with other security events
  • Explore OTTL processors for advanced audit log transformation