On Demand Log Forwarding

On Demand Log Forwarding is used to temporarily forward specific sets of raw data to Streaming Destinations for a given time period. Some examples include:

  • Forwarding logs for a given service for 30 minutes following a deployment.
  • Forwarding logs for a given cluster for 1 hour when an alert triggers.

Both the duration and the log sources to be forwarded can be granularly defined to meet a range of use cases. This feature is designed to be called remotely via API, which allows raw log forwarding to be automated from third-party applications, such as CI/CD pipeline tooling and third-party alerting systems.

On demand log forwarding is handled by a type of conditional workflow. For more information about workflows, see the workflow page.


Configuring On Demand Log Forwarding

There are four steps to configuring on demand log forwarding:

  1. Plan the log requirements.
  2. Configure a streaming destination.
  3. Configure a workflow condition.
  4. Configure a workflow.

Each of these steps is discussed in more detail next.

Plan the Log Forwarding Requirements

You configure a workflow for a specific set of input labels, and create an API trigger for a specific set of input attributes. You need to examine the Inputs section in the agent yaml to determine which labels to include in your workflow, and which source attributes to include in your API call. Plan the on demand log requirements and take note of the identifying labels and source attributes.
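For illustration, an Inputs section in the agent yaml might look like the following sketch. The errorcheck label and the Kubernetes include pattern are hypothetical examples; the exact schema depends on your agent version, so consult your own configuration for the real labels and attributes:

```yaml
inputs:
  kubernetes:
    # Hypothetical input: Kubernetes logs from the v1env namespace, tagged
    # with the "errorcheck" label that a conditional workflow can reference.
    - labels: "errorcheck"
      include:
        - "namespace=v1env"
```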

The following source attributes can be used to identify the source:

Attribute     Description
tag           Edge Delta Agent tag
host          Hostname where the Edge Delta Agent is running
src_type      Source type: File, K8s, Docker, ECS, etc.
environment   Environment in the "agent settings" section of the ED Agent Config
app           App in the "agent settings" section of the ED Agent Config
region        Region in the "agent settings" section of the ED Agent Config

In addition, the following source type attributes can be used:

Source Type Attribute
Kubernetes
  • k8s_namespace
  • k8s_controller_kind
  • k8s_controller_logical_name
  • k8s_pod_name
  • k8s_container_name
  • k8s_container_image
Amazon Elastic Container Service
  • ecs_cluster
  • ecs_container
  • ecs_task_family
  • ecs_task_version
Docker
  • docker_container_name
  • docker_image
File
  • file_glob_path

You may want to configure multiple conditional workflows, each triggered by its own API call, to cover different scenarios with different log forwarding needs.
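For example, two separately named on-demand conditions could back two differently scoped workflows. The second name below, alert_on_my_cluster, is a hypothetical example:

```yaml
workflow_conditions:
  # Condition referenced by a service-scoped workflow
  - name: alert_on_my_service
    type: on_demand
  # Hypothetical second condition for a cluster-scoped workflow
  - name: alert_on_my_cluster
    type: on_demand
```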

Configure a Streaming Destination for On Demand Log Forwarding

We recommend creating a dedicated streaming destination just for on demand log forwarding. It should have only the 'log' feature enabled.
Configure the log feature when creating a destination for on demand logs.

Note

A streaming destination must have the 'log' feature enabled for on demand log streaming to work.

See the Streaming Outputs page for instructions on configuring a streaming destination.

Configure a Workflow Condition

Create a condition entry in the agent yaml that will be referred to in the workflow, which in turn will be triggered by the API call. To create the condition entry, start a root section called workflow_conditions. Give it a name and specify the type on_demand. In the following example the condition is named alert_on_my_service.

workflow_conditions:
  - name: alert_on_my_service
    type: on_demand

Configure a Workflow

Create a conditional workflow in the agent yaml that references the workflow condition you created earlier, in this case alert_on_my_service. Specify the input labels you identified in your input configuration, in this example errorcheck. Finally, in the destinations section, specify the on demand log streaming destination you configured earlier.

  conditional-workflow:
    input_labels:
      - errorcheck
    conditions:
      - alert_on_my_service
    destinations:
      - '{{ Env "SUMO36" }}'

The conditional workflow has now been configured for on demand forwarding. It will output to the specified destination when it is successfully triggered by the API call.

Note

The API call's condition name AND all the attributes associated with the input label must match the workflow configuration for the workflow to be successfully triggered. If even one attribute does not match, the workflow won't be triggered.


Triggering On Demand Log Forwarding

Plan the On Demand Log Requirements

Now that you have configured on demand log forwarding you can trigger it. First you should plan your API call:

  • Obtain the Organization ID
  • Obtain the API token that will authorize you to call the on_demand API endpoint.
  • Identify the workflow.
  • Identify the source attributes that you are interested in.
  • Determine how long you want the workflow to continue sending logs.

Obtain the Organization ID

You need your organization ID to call the on_demand endpoint:

  1. In the Edge Delta web app click Management - My Organization.
  2. Copy the Organization ID.

Obtain an API Token

The on_demand API endpoint can be accessed using an API token. You generate a token in the Edge Delta web app:

  1. Select My Organization and click API Tokens.
  2. Click Create Token.
  3. Select Add Permissions.
  4. Configure the Agent Configuration - All Resources - Write permission and click Add to Token.
  5. Click Token Details, add a token name and click Create.
  6. Copy the token value.

Identify the Workflow

You may have multiple on-demand workflows configured for different scenarios. You need to call the workflow that is configured for the logs you want to see. Examine your agent configuration to determine which on demand conditional workflow to call and take note of the workflow_conditions name that was referenced in the conditions section of the workflow.

Identify the Source Attributes

Identify the source attributes that you are interested in. To avoid receiving too many irrelevant logs, specify the attributes for exactly the sources that generate the logs you need to see. The attributes you can specify are listed in the table in the preceding Plan the Log Forwarding Requirements section.

Note

The source attributes must be associated with the input labels you specified in the workflow you are calling.

Identify the Forwarding Duration

Determine how long you want the workflow to continue sending logs once it has been triggered. Bear in mind the timeframe starts when the workflow event is created by calling the API, but the log forwarding will only start when the agent polls the API. There may be a delay of up to 1 minute between polling periods when the agent checks for triggers. Therefore, you may want to add one minute to your forwarding duration to account for the maximum polling delay.
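As a rough sketch, that padding logic can be expressed as follows. The function name and the one-minute default are illustrative; the actual polling period of your agents may differ:

```python
def padded_duration(minutes_needed: int, max_poll_delay_s: int = 60) -> str:
    """Return a Go-style duration string (e.g. "6m") covering the requested
    forwarding window plus the worst-case agent polling delay."""
    total_s = minutes_needed * 60 + max_poll_delay_s
    m, s = divmod(total_s, 60)
    return f"{m}m" if s == 0 else f"{m}m{s}s"

# Requesting 5 minutes of logs with up to 1 minute of polling delay:
print(padded_duration(5))  # 6m
```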

Compose and Execute the API Call

You can execute an API call to start log forwarding.

  1. Compose a POST API call.
  2. Specify the on_demand endpoint and include your organization ID:
     http://api.edgedelta.com/v1/orgs/<Organization_ID>/on_demand
  3. Create a header field for the X-ED-API-Token key and specify your token as the value.
  4. In the request body, specify the name previously configured as a condition parameter in the workflow yaml.
  5. Specify the matching attributes for the source or input.
  6. Specify the duration you want the workflow to be active.

If the call is successful, you will get a 201 response.

Note

You may have to wait up to 60 seconds for the agent to trigger because it polls the API every minute to check for matching calls.

API Call Example

The following example illustrates the triggering API call. The organization ID and token value have been replaced with placeholders. It triggers the alert_on_my_service workflow, which specified the errorcheck input label. If there are matching attributes for the inputs with that label, namely a kubernetes source with a namespace of v1env and a tag of prod-2, then the workflow will forward logs for 5 minutes from the moment it executes, which will occur within one minute.

curl --location --request POST 'http://api.edgedelta.com/v1/orgs/<organization_ID>/on_demand' \
--header 'X-ED-API-Token: <token_value>' \
--header 'Content-Type: application/json' \
--data-raw '{
  "name": "alert_on_my_service",
  "attributes": {
    "k8s_namespace": "v1env",
    "src_type": "K8s",
    "tag": "prod-2"
  },
  "future_duration": "5m"
}'
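For automation from scripts, such as CI/CD jobs, the same call can be composed programmatically. The following sketch uses only Python's standard library and builds the request without sending it; the endpoint, header, and body fields mirror the curl example above, and the helper function name is an illustrative choice:

```python
import json
import urllib.request

def build_on_demand_request(org_id: str, token: str) -> urllib.request.Request:
    """Compose (but do not send) the on_demand trigger request."""
    body = {
        "name": "alert_on_my_service",   # workflow_conditions name
        "attributes": {                  # must match the labeled input's attributes
            "k8s_namespace": "v1env",
            "src_type": "K8s",
            "tag": "prod-2",
        },
        "future_duration": "5m",         # forwarding window
    }
    return urllib.request.Request(
        url=f"http://api.edgedelta.com/v1/orgs/{org_id}/on_demand",
        data=json.dumps(body).encode(),
        headers={
            "X-ED-API-Token": token,
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_on_demand_request("<organization_ID>", "<token_value>")
# urllib.request.urlopen(req) would send it; a 201 response indicates success.
```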

The following example illustrates a pod log from an on demand trigger. Note how the future duration is less than the configured 5 minutes due to the polling delay of 44.5 seconds:

2022-09-21T18:44:09.673Z	INFO	logprocessors/on_demand_processor.go:208	log-forwarder-pipe-wf-temp-snito-sandbox-2|stat|ip-10-0-0-21.us-west-1.compute.internal|K8s|v1env#mocha-5ffc66c689-h295x#mocha-workflow-testing-on-demand-processor extended forwardUntil to 2022-09-21 18:48:25.193 +0000 UTC with calculated max future duration 4m15.519275197s
