Redis Pack
Edge Delta Pipeline Pack for Redis
Overview
The Redis pack is designed to efficiently process Redis logs, providing comprehensive insights into database operations and performance. By ingesting logs and extracting vital fields, this pack structures data to enable targeted analysis based on log levels and database activities. It transforms informational logs into meaningful metrics, capturing both standard operations and error occurrences.
Pack Description
1. Data Ingestion
The data flow begins with the compound_input node. This node serves as the entry point into the pack and receives the incoming Redis logs for processing.
2. Log Parsing
Logs are initially processed by a Grok node. This node uses a pattern to extract fields from the log entry, such as pid, role, timestamp, log_level, and message.
- name: grok
  type: grok
  pattern: '%{INT:pid}:%{WORD:role} %{FULL_REDIS_TIMESTAMP:timestamp} (?:%{DATA:log_level} )?%{GREEDYDATA:message}'
By converting unstructured logs into structured data, this node facilitates easier search and analysis. From an operational perspective, structured data simplifies the process of log querying and reporting, making it more efficient to identify and troubleshoot issues within Redis.
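For illustration only, the following Python sketch approximates what the Grok pattern extracts from one of the sample log lines shown later in this page. The regex used in place of FULL_REDIS_TIMESTAMP is an assumption, not the pack's definition of that custom pattern.

import re

# Rough Python stand-in for the Grok pattern above. FULL_REDIS_TIMESTAMP is
# approximated with an explicit date/time regex (an assumption, not the
# pack's definition of that custom pattern).
LOG_RE = re.compile(
    r"(?P<pid>\d+):(?P<role>\w+) "
    r"(?P<timestamp>\d{1,2} \w{3} \d{4} \d{2}:\d{2}:\d{2}\.\d{3}) "
    r"(?:(?P<log_level>\S+) )?(?P<message>.*)"
)

line = "3628:M 20 Sep 2024 18:30:56.839 # ERROR: Failed to connect to master"
print(LOG_RE.match(line).groupdict())
# {'pid': '3628', 'role': 'M', 'timestamp': '20 Sep 2024 18:30:56.839',
#  'log_level': '#', 'message': 'ERROR: Failed to connect to master'}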
3. Timestamp Transformation
The parsed logs move to a Log Transform node, which converts the log timestamp to Unix milliseconds via the convert_timestamp macro. This ensures consistency in timestamp formats across different logs.
- name: log_transform
  type: log_transform
  transformations:
    - field_path: timestamp
      operation: upsert
      value: convert_timestamp(item["attributes"]["timestamp"], "02 Jan 2006 15:04:05.000", "Unix Milli")
By normalizing timestamps, you ensure an accurate timeline of log events, which is critical for correlating, analyzing, and diagnosing performance issues or other anomalies over time. See Manage Log Timestamps with Edge Delta.
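As a rough sketch of what this conversion produces (not the macro's implementation), the Python snippet below parses the same timestamp layout ("02 Jan 2006 15:04:05.000" in Go reference-layout terms) and emits Unix milliseconds. Treating the value as UTC is an assumption; Redis logs the server's local time.

from datetime import datetime, timedelta, timezone

# Illustrative stand-in for the convert_timestamp macro. Interpreting the
# timestamp as UTC is an assumption made for this sketch.
EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def to_unix_milli(ts: str) -> int:
    dt = datetime.strptime(ts, "%d %b %Y %H:%M:%S.%f").replace(tzinfo=timezone.utc)
    return (dt - EPOCH) // timedelta(milliseconds=1)

print(to_unix_milli("20 Sep 2024 18:30:54.838"))  # 1726857054838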
4. Log Level Routing
The processed logs proceed to a Route node, which directs logs based on their level: error or info.
- name: route
  type: route
  paths:
    - path: error
      condition: regex_match(item["body"], "(?i)ERROR") || item["attributes"]["log_level"] == "#"
      exit_if_matched: false
    - path: info
      condition: regex_match(item["body"], "(?i)INFO") || item["attributes"]["log_level"] == "*" || item["attributes"]["log_level"] == "-"
      exit_if_matched: false
Each condition parameter is a CEL expression that performs a case-insensitive regex match on the log body or compares the log_level attribute.
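The following Python sketch mirrors the two CEL conditions, purely for illustration. The log_level markers follow Redis conventions, where "#" denotes a warning, "*" a notice, and "-" a verbose message.

import re

# Python mirror of the two CEL route conditions, for illustration only.
def is_error(body: str, log_level: str) -> bool:
    return bool(re.search(r"ERROR", body, re.IGNORECASE)) or log_level == "#"

def is_info(body: str, log_level: str) -> bool:
    return bool(re.search(r"INFO", body, re.IGNORECASE)) or log_level in ("*", "-")

body = "3628:M 20 Sep 2024 18:30:56.839 # ERROR: Failed to connect to master"
print(is_error(body, "#"), is_info(body, "#"))  # True False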
5. Error Pattern Extraction
Error logs are handled by a Log to Pattern node. This node identifies and extracts patterns from error logs, useful for pinpointing recurring issues or anomalies.
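As a toy illustration of the general idea (not the Log to Pattern node's actual algorithm), the sketch below masks variable tokens such as numbers so that recurring error messages collapse into one pattern; the sample messages are hypothetical.

import re
from collections import Counter

# Toy illustration of pattern extraction: mask variable tokens so similar
# error messages group under a single pattern. Not the node's real algorithm.
def to_pattern(message: str) -> str:
    return re.sub(r"\d+", "<num>", message)

errors = [
    "ERROR: Failed to connect to master after 3 retries",
    "ERROR: Failed to connect to master after 7 retries",
]
print(Counter(to_pattern(m) for m in errors))
# Counter({'ERROR: Failed to connect to master after <num> retries': 2})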
6. Log to Metric Conversion for Info Logs
Informational logs are directed to a Log to Metric node. It aggregates and converts structured log data into metrics, facilitating effective reporting and monitoring.
- name: log_to_metric
  type: log_to_metric
  pattern: '(?P<database_id>DB \d+): (?P<keys_count>\d+) keys \((?P<volatile>\d+) volatile\) in (?P<slots>\d+) slots'
  interval: 1m0s
  skip_empty_intervals: false
  only_report_nonzeros: false
  dimension_groups:
    - dimensions:
        - database_id
      numeric_dimension: volatile
    - dimensions:
        - database_id
      numeric_dimension: slots
    - dimensions:
        - database_id
      numeric_dimension: keys_count
- pattern: Helps in identifying the relevant log lines to be converted into metrics.
- interval: Sets the reporting interval. The default is 1 minute.
- dimension_groups: Specifies the dimensions and numeric fields used to group metrics, enabling detailed monitoring of Redis database performance metrics like volatile keys, slots, and keys_count.
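To make the capture groups and dimension groups concrete, here is an illustrative Python sketch; the sample "DB N: ..." line is modeled on Redis's periodic keyspace stats output and is not taken from this pack.

import re

# Sketch of what the log_to_metric pattern captures from an info log line.
METRIC_RE = re.compile(
    r"(?P<database_id>DB \d+): (?P<keys_count>\d+) keys "
    r"\((?P<volatile>\d+) volatile\) in (?P<slots>\d+) slots"
)

line = "DB 0: 1523 keys (12 volatile) in 4096 slots"  # hypothetical sample
fields = METRIC_RE.search(line).groupdict()

# Each dimension group pairs database_id with one numeric field, mirroring
# the three dimension_groups above.
for numeric in ("volatile", "slots", "keys_count"):
    print({"database_id": fields["database_id"], numeric: int(fields[numeric])})
# {'database_id': 'DB 0', 'volatile': 12}
# {'database_id': 'DB 0', 'slots': 4096}
# {'database_id': 'DB 0', 'keys_count': 1523}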
7. Error Log Output
Error logs are captured by the error_logs node, a compound_output node. This ensures that error logs are readily available for deeper analysis or alerting.
8. Informational Logs Output
The info_logs node, another compound_output node, captures informational logs. This structure enables easy segregation of log data based on the level of detail required.
9. Error Pattern Output
Patterns identified in error logs are directed to the error_patterns node, a compound_output node, facilitating pattern-based alerts or cumulative analysis of error patterns.
10. Info Metrics Output
Generated metrics from informational logs are routed to the info_metrics node, a compound_output node, enabling ongoing monitoring of Redis operational data and potentially identifying issues before they escalate.
11. Unmatched Logs Output
Logs that do not match either route condition are sent to the unmatched_logs node, another compound_output node, ensuring comprehensive log retention for additional processing or custom analysis.
Sample Input
4950:M 20 Sep 2024 18:30:54.838 # DEBUG: Memory usage exceeded threshold
2985:M 20 Sep 2024 18:30:55.839 # DEBUG: Database synchronization error
3628:M 20 Sep 2024 18:30:56.839 # ERROR: Failed to connect to master