Edge Delta AWS S3 Output

Stream data to AWS S3.

Overview

The AWS S3 output streams analytics and insights to an S3 bucket.

Example

    - name: my-s3-streamer
      type: s3stream
      aws_key_id: '{{ Env "AWS_KEY_ID" }}'
      aws_sec_key: '{{ Env "AWS_SECRET_KEY" }}'
      bucket: testbucket
      region: us-east-2
      flush_interval: 30s
      flush_bytesize: 1M

Parameters

name

Required

Enter a descriptive name for the output or integration.

For outputs, this name will be used to map this destination to a workflow.

name: my-s3-streamer

integration_name

Optional

This parameter refers to the organization-level integration created in the Integrations page.

If you need to add multiple instances of the same integration to the config, you can give each instance a custom name via the name parameter. In that case, use the name to refer to the specific instance of the destination in workflows.

integration_name: orgs-s3stream

type

Required

Enter s3stream.

type: s3stream

features

Optional

This parameter defines which data types to stream to the destination.

To learn more, see the following section on supported feature types.

features: log

bucket

Required

Enter the target S3 bucket.

bucket: testbucket

region

Required

Enter the specified S3 bucket’s region.

region: us-east-2

aws_key_id

Optional

Enter the AWS access key ID associated with the specified bucket.

aws_key_id: '{{ Env "AWS_KEY_ID" }}'

aws_sec_key

Optional

Enter the AWS secret access key associated with the specified bucket.

aws_sec_key: '{{ Env "AWS_SECRET_KEY" }}'

role_arn

Optional

To assume an AWS IAM role, enter the account ID and role name.

role_arn: "arn:aws:iam::<ACCOUNT_ID>:role/<ROLE_NAME>"

external_id

Optional

Enter a unique identifier to prevent a confused deputy attack when the role is assumed.

external_id: "053cf606-8e80-47bf-b849-8cd1cc826cfc"
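When the agent's environment already provides base AWS credentials (for example, an EC2 instance profile), role_arn and external_id can be used in place of static keys. The following sketch is illustrative; the account ID, role name, and external ID are placeholders:

    - name: my-s3-streamer
      type: s3stream
      bucket: testbucket
      region: us-east-2
      role_arn: "arn:aws:iam::<ACCOUNT_ID>:role/<ROLE_NAME>"
      external_id: "053cf606-8e80-47bf-b849-8cd1cc826cfc"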

flush_interval

Optional

Enter a time interval at which data, including buffered data, is flushed (forced) to the destination.

flush_interval: 30s

flush_bytesize

Optional

Enter a data size threshold at which data, including buffered data, is flushed (forced) to the destination.

flush_bytesize: 1M

buffer_ttl

Optional

Enter the length of time to keep retrying failed streaming data.

After this period elapses, the failed streaming data is no longer retried.

buffer_ttl: 2h

buffer_path

Optional

Enter a folder path to temporarily store failed streaming data.

The failed streaming data is retried until it reaches its destination or until the buffer_ttl value is reached.

If you enter a path that does not exist, then the agent will create directories, as needed.

buffer_path: /var/log/edgedelta/pushbuffer/

buffer_max_bytesize

Optional

Enter the maximum amount of failed streaming data to retain for retry.

Failed streaming data larger than this size is not retried.

buffer_max_bytesize: 100MB
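Taken together, the three buffer parameters control how failed streaming data is retried. As an illustrative sketch (the values below are examples, not defaults), the following retries failed data for up to 2 hours, spooling at most 100 MB to the given path:

    - name: my-s3-streamer
      type: s3stream
      bucket: testbucket
      region: us-east-2
      buffer_ttl: 2h
      buffer_path: /var/log/edgedelta/pushbuffer/
      buffer_max_bytesize: 100MB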

custom_tags

Optional

This parameter defines key-value pairs that are streamed with every request.

custom_tags:
  "Host": "{{.Host}}"
  "Source": "{{.Source}}"
  "SourceType": "{{.SourceType}}"
  "Tag": "{{.Tag}}"
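Putting the pieces together, a fuller configuration might look like the following sketch. The environment variable names, the static "Team" tag, and the templated values are illustrative, not required:

    - name: my-s3-streamer
      type: s3stream
      aws_key_id: '{{ Env "AWS_KEY_ID" }}'
      aws_sec_key: '{{ Env "AWS_SECRET_KEY" }}'
      bucket: testbucket
      region: us-east-2
      flush_interval: 30s
      flush_bytesize: 1M
      buffer_ttl: 2h
      buffer_path: /var/log/edgedelta/pushbuffer/
      custom_tags:
        "Host": "{{.Host}}"
        "Team": "observability"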

Supported Features

See Streaming Features.

Feature Type              Supported?
Log                       Yes
Metrics                   No
Alert as event            No
Alert as log              No
Health                    No
Dimensions as attribute   No
Send as is                No
Send as JSON              No
Custom tags               No
EDAC enrichment           No
Message template          No
outgoing_bytes.sum        Yes
outgoing_raw_bytes.sum    Yes
outgoing_lines.count      No
output buffering to disk  No