Edge Delta Azure Blob Storage Output
This output type sends logs to an Azure Blob Storage endpoint.
Before you can create an output, you must have an account key. See Microsoft's documentation on managing storage account access keys.
Example
- name: my-blob
  type: blob
  account_name: '{{ Env "BLOB_ACCOUNT_NAME" }}'
  account_key: '{{ Env "BLOB_ACCOUNT_KEY" }}'
  container: testcontainer
  auto_create_container: false
Parameters
name
Required
Enter a descriptive name for the output or integration.
For outputs, this name will be used to map this destination to a workflow.
name: my-blob
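As an illustration, a workflow routes data to this output by listing the output's name under its destinations. The following is a minimal sketch, assuming a workflows section in the same agent config; the workflow name and input label are placeholders.
workflows:
  my_workflow:
    input_labels:
      - system_stats   # placeholder input label
    destinations:
      - my-blob        # matches the name given above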
integration_name
Optional
This parameter refers to the organization-level integration created on the Integrations page.
If you need to add multiple instances of the same integration to the config, you can give each instance a custom name via the name parameter. In that case, use the name to refer to the specific instance of the destination in workflows.
integration_name: blob_acct
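For example, two instances of the same organization-level integration might be configured as in the following hypothetical sketch. The instance names and container values are illustrative, and each instance is assumed to inherit its credentials from the blob_acct integration.
- name: blob-us
  integration_name: blob_acct
  type: blob
  container: us-logs   # illustrative per-instance container
- name: blob-eu
  integration_name: blob_acct
  type: blob
  container: eu-logs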
type
Required
Enter blob.
type: blob
account_name
Required
Enter the name of the Azure storage account.
account_name: '{{ Env "BLOB_ACCOUNT_NAME" }}'
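account_key
Required
Enter the access key for the Azure storage account. As noted above, you must have an account key before you can create this output; reading it from an environment variable, as shown, avoids storing the secret in the config.
account_key: '{{ Env "BLOB_ACCOUNT_KEY" }}'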
container
Required
Enter the name of the container to upload to.
container: testcontainer
auto_create_container
Optional
Enter true to create the container on the service if it does not exist. The container is created with no metadata and no public access.
auto_create_container: false
compression
Optional
Enter a compression type for archiving purposes.
You can enter gzip, zstd, snappy, or uncompressed.
compression: gzip
encoding
Optional
Enter an encoding type for archiving purposes.
You can enter json or parquet.
encoding: parquet
use_native_compression
Optional
Enter true or false to apply compression natively within the parquet encoding, so that the data is compressed but the parquet metadata is not.
This option can be useful with cloud big data services, such as AWS Athena and Google BigQuery.
Note: To use this parameter, you must set the encoding parameter to parquet.
use_native_compression: true
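Putting the archiving parameters together, a blob output that writes natively compressed parquet might look like the following sketch. It assumes the compression parameter selects the codec that native compression applies; the container name is illustrative.
- name: my-blob-archive
  type: blob
  account_name: '{{ Env "BLOB_ACCOUNT_NAME" }}'
  account_key: '{{ Env "BLOB_ACCOUNT_KEY" }}'
  container: archive             # illustrative
  encoding: parquet              # required for use_native_compression
  compression: zstd              # assumed to be the codec applied natively
  use_native_compression: true   # compresses parquet data, not metadata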
buffer_ttl
Optional
Enter the length of time to keep retrying failed streaming data.
After this period elapses, the failed streaming data is no longer retried.
buffer_ttl: 2h
buffer_path
Optional
Enter a folder path to temporarily store failed streaming data.
The failed streaming data is retried until it reaches its destination or until the buffer_ttl value is reached.
If you enter a path that does not exist, the agent creates the directories as needed.
buffer_path: /var/log/edgedelta/pushbuffer/
buffer_max_bytesize
Optional
Enter the maximum size of failed streaming data to retry.
Failed streaming data larger than this size is not retried.
buffer_max_bytesize: 100MB
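Combining the buffering parameters, a sketch with illustrative values:
- name: my-blob
  type: blob
  account_name: '{{ Env "BLOB_ACCOUNT_NAME" }}'
  account_key: '{{ Env "BLOB_ACCOUNT_KEY" }}'
  container: testcontainer
  buffer_ttl: 2h                                # retry failed data for up to 2 hours
  buffer_path: /var/log/edgedelta/pushbuffer/   # created by the agent if missing
  buffer_max_bytesize: 100MB                    # larger failed payloads are not retried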