OTTL Time Conversion Guide
Overview
Time conversion is a critical aspect of telemetry data processing. Different systems use different timestamp formats, and Edge Delta uses UnixMilli (milliseconds since the Unix epoch, January 1, 1970 UTC) as its internal timestamp format. This guide provides testable examples for converting between formats commonly used in enterprise integrations.
Key Concepts
Edge Delta Timestamp Format
- Native Format: UnixMilli (int64) - milliseconds since Unix epoch
- Example: 1735776000000 represents 2025-01-02 00:00:00 UTC
Common Time Functions
- Time(): Converts string to time.Time
- UnixMilli(): Converts time.Time to milliseconds since epoch
- Format(): Formats values into strings
- Now(): Returns current system time
- Duration(): Converts string to time.Duration
- TruncateTime(): Rounds time down to specified precision
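As a quick illustration, the following sketch chains several of these functions: it parses a fixed ISO8601 string, converts it to UnixMilli, and records the current processing time. The attribute names here are illustrative, not part of any integration:
set(cache["parsed"], Time("2025-01-02T15:30:45Z", "%Y-%m-%dT%H:%M:%SZ"))
set(attributes["parsed_ms"], UnixMilli(cache["parsed"]))
set(attributes["observed_at"], UnixMilli(Now()))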
Use Case 1: Convert UnixMilli to ServiceNow Format
ServiceNow APIs expect datetime in format: yyyy-MM-dd HH:mm:ss (UTC timezone).
Example: Convert an Edge Delta timestamp to a ServiceNow datetime string
Input
{
"_type": "log",
"timestamp": 1735790400000,
"body": "User login event",
"resource": {},
"attributes": {}
}
Statement
set(cache["time_obj"], Time("1970-01-01T00:00:00Z", "%Y-%m-%dT%H:%M:%SZ"))
set(cache["time_obj_with_offset"], TruncateTime(cache["time_obj"], Duration(Format("%dms", [timestamp]))))
set(attributes["servicenow_datetime"], Format("%04d-%02d-%02d %02d:%02d:%02d", [Year(cache["time_obj_with_offset"]), Month(cache["time_obj_with_offset"]), Day(cache["time_obj_with_offset"]), Hour(cache["time_obj_with_offset"]), Minute(cache["time_obj_with_offset"]), Second(cache["time_obj_with_offset"])]))
Output
{
"_type": "log",
"timestamp": 1735790400000,
"body": "User login event",
"resource": {},
"attributes": {
"servicenow_datetime": "2025-01-02 00:00:00"
}
}
Explanation: This converts the UnixMilli timestamp (1735776000000) into ServiceNow’s required format “2025-01-02 00:00:00”. The conversion creates a time.Time object at the Unix epoch, offsets it by the timestamp as a duration, then formats the result using the Year(), Month(), Day(), Hour(), Minute(), and Second() extractors.
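If incoming events may lack a usable timestamp, the same conversion can be guarded with where clauses so the cache entries are only built for positive values. This is a defensive sketch that mirrors the guards used in the arithmetic example below:
set(cache["time_obj"], Time("1970-01-01T00:00:00Z", "%Y-%m-%dT%H:%M:%SZ")) where timestamp != nil and timestamp > 0
set(cache["time_obj_with_offset"], TruncateTime(cache["time_obj"], Duration(Format("%dms", [timestamp])))) where cache["time_obj"] != nil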
Alternative: Arithmetic-Only Conversion (No Time Functions)
For environments where the Time() function may be unavailable or for maximum portability, you can use pure arithmetic to convert UnixMilli to ServiceNow format. This approach manually calculates date/time components from the Unix timestamp.
Example: Convert UnixMilli to ServiceNow using only arithmetic
Input
{
"_type": "log",
"timestamp": 1735790400000,
"body": "User login event",
"resource": {
"raw_data": {
"date": "1735790400000"
}
},
"attributes": {}
}
Statement
set(cache["ts_ms"], Int(resource["raw_data"]["date"])) where resource["raw_data"]["date"] != nil and resource["raw_data"]["date"] != ""
set(cache["ts_sec"], Int(cache["ts_ms"] / 1000)) where cache["ts_ms"] != nil and cache["ts_ms"] > 0
set(cache["days_since_epoch"], Int(cache["ts_sec"] / 86400)) where cache["ts_sec"] != nil
set(cache["seconds_in_day"], cache["ts_sec"] - (cache["days_since_epoch"] * 86400)) where cache["days_since_epoch"] != nil
set(cache["hours"], Int(cache["seconds_in_day"] / 3600)) where cache["seconds_in_day"] != nil
set(cache["remaining_after_hours"], cache["seconds_in_day"] - (cache["hours"] * 3600)) where cache["hours"] != nil
set(cache["minutes"], Int(cache["remaining_after_hours"] / 60)) where cache["remaining_after_hours"] != nil
set(cache["seconds"], cache["remaining_after_hours"] - (cache["minutes"] * 60)) where cache["minutes"] != nil
set(cache["years_since_1970"], Int(cache["days_since_epoch"] / 365.25)) where cache["days_since_epoch"] != nil
set(cache["year"], 1970 + cache["years_since_1970"]) where cache["years_since_1970"] != nil
set(cache["leap_days"], Int(cache["years_since_1970"] / 4)) where cache["years_since_1970"] != nil
set(cache["day_of_year"], cache["days_since_epoch"] - (cache["years_since_1970"] * 365) - cache["leap_days"]) where cache["leap_days"] != nil
set(cache["month"], Int(cache["day_of_year"] / 30.44) + 1) where cache["day_of_year"] != nil and cache["day_of_year"] >= 0
set(cache["day"], Int(cache["day_of_year"] - ((cache["month"] - 1) * 30.44)) + 1) where cache["month"] != nil
set(attributes["time_of_event"], Format("%04d-%02d-%02d %02d:%02d:%02d", [cache["year"], cache["month"], cache["day"], cache["hours"], cache["minutes"], cache["seconds"]])) where cache["year"] != nil and cache["month"] != nil and cache["day"] != nil and cache["hours"] != nil and cache["minutes"] != nil and cache["seconds"] != nil
Output
{
"_type": "log",
"timestamp": 1735790400000,
"body": "User login event",
"resource": {
"raw_data": {
"date": "1735790400000"
}
},
"attributes": {
"time_of_event": "2025-01-02 00:00:00"
}
}
Explanation: This arithmetic-only approach converts UnixMilli to ServiceNow format without using Time() functions. It works by:
- Converting to seconds: ts_ms / 1000 → seconds since Unix epoch
- Extracting days: ts_sec / 86400 → total days since 1970-01-01
- Calculating time components: breaking down the seconds within the day to get hours, minutes, and seconds
- Calculating date components:
  - Years: days / 365.25 (accounting for leap years)
  - Leap days: years / 4 (simple leap year approximation)
  - Day of year: total days minus (years × 365) minus leap days
  - Month: day_of_year / 30.44 (average days per month)
  - Day: remaining days after accounting for months
Note that with this input the result is 2025-01-03 rather than 2025-01-02: the years / 4 approximation counts 13 leap days between 1970 and 2025 instead of the actual 14, which shifts the date forward one day (see Limitations below).
When to use this approach:
- When Time() function is not available or causes issues
- When you need maximum portability across different OTTL implementations
- When you want explicit control over the conversion logic
- For debugging timestamp conversion issues
Limitations:
- Month/day calculations use averages (30.44 days/month) so exact dates may be off by 1-2 days
- Leap year calculation is simplified (doesn’t account for century rules)
- Best suited for timestamps that don’t require exact date precision
- For production use with strict accuracy requirements, prefer the Time() function approach
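If you need the arithmetic approach but cannot tolerate the one-day drift, the simplified leap-day step can be replaced with the exact Gregorian count: leap years are those divisible by 4, except century years not divisible by 400, and there are 477 such years before 1970. The sketch below substitutes for the cache["leap_days"] statement above; the helper key cache["y"] is illustrative:
set(cache["y"], cache["year"] - 1) where cache["year"] != nil
set(cache["leap_days"], Int(cache["y"] / 4) - Int(cache["y"] / 100) + Int(cache["y"] / 400) - 477) where cache["y"] != nil
For 2025 this yields 14 leap days instead of 13, which lands the example input back on 2025-01-02. The 30.44-day month approximation still applies, so month boundaries remain approximate.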
Use Case 2: Convert ISO8601 String to UnixMilli
Example: Parse ISO8601 timestamp from log and convert to Edge Delta format
Input
{
"_type": "log",
"timestamp": 1700000000000,
"body": "Application started at 2025-01-02T15:30:45Z",
"resource": {},
"attributes": {
"log_timestamp": "2025-01-02T15:30:45Z"
}
}
Statement
set(timestamp, UnixMilli(Time(attributes["log_timestamp"], "%Y-%m-%dT%H:%M:%SZ")))
Output
{
"_type": "log",
"timestamp": 1735833045000,
"body": "Application started at 2025-01-02T15:30:45Z",
"resource": {},
"attributes": {
"log_timestamp": "2025-01-02T15:30:45Z"
}
}
Explanation: The Time() function parses the ISO8601 string using the format pattern, then UnixMilli() converts it to Edge Delta’s native millisecond format. The timestamp field is updated from 1700000000000 to 1735831845000.
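If the source string carries fractional seconds (for example "2025-01-02T15:30:45.123Z"), the %f pattern from Use Case 3 can be combined with the literal Z suffix. This sketch assumes your log format ends in Z rather than a numeric offset:
set(timestamp, UnixMilli(Time(attributes["log_timestamp"], "%Y-%m-%dT%H:%M:%S.%fZ")))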
Use Case 3: Convert RFC3339 with Timezone to UnixMilli
RFC3339 is the recommended format for internet protocols and includes timezone offsets.
Example: Parse RFC3339 timestamp with timezone
Input
{
"_type": "log",
"timestamp": 1700000000000,
"body": "Event logged",
"resource": {},
"attributes": {
"event_time": "2025-01-02T15:30:45.123456-08:00"
}
}
Statement
set(timestamp, UnixMilli(Time(attributes["event_time"], "%Y-%m-%dT%H:%M:%S.%f%z")))
Output
{
"_type": "log",
"timestamp": 1735861845123,
"body": "Event logged",
"resource": {},
"attributes": {
"event_time": "2025-01-02T15:30:45.123456-08:00"
}
}
Explanation: The %z pattern handles the timezone offset (-08:00), and %f captures microseconds. The timestamp 1735860645123 represents the correct UTC time (2025-01-02 23:30:45.123) after timezone conversion.
Use Case 4: Convert UnixMilli to Splunk Format
Splunk recommends RFC3339 format: %Y-%m-%dT%T.%6N%:z
Example: Format timestamp for Splunk ingestion
Input
{
"_type": "log",
"timestamp": 1735833045123,
"body": "Metric data point",
"resource": {},
"attributes": {
"value": 42.5
}
}
Statement
set(cache["epoch"], Time("1970-01-01T00:00:00Z", "%Y-%m-%dT%H:%M:%SZ"))
set(cache["event_time"], TruncateTime(cache["epoch"], Duration(Format("%dms", [timestamp]))))
set(cache["micros"], Int((timestamp - (Int(timestamp / 1000) * 1000)) * 1000))
set(attributes["splunk_time"], Format("%04d-%02d-%02dT%02d:%02d:%02d.%06dZ", [Year(cache["event_time"]), Month(cache["event_time"]), Day(cache["event_time"]), Hour(cache["event_time"]), Minute(cache["event_time"]), Second(cache["event_time"]), cache["micros"]]))
Output
{
"_type": "log",
"timestamp": 1735833045123,
"body": "Metric data point",
"resource": {},
"attributes": {
"value": 42.5,
"splunk_time": "2025-01-02T15:30:45.123000Z"
}
}
Explanation: Converts UnixMilli to Splunk’s recommended RFC3339 format with microsecond precision. The format includes date, time, and milliseconds converted to microseconds (123ms → 123000μs).
Use Case 5: Extract Time Components for Filtering
Example: Extract hour and day for business hours filtering
Input
{
"_type": "log",
"timestamp": 1735822245000,
"body": "API request received",
"resource": {},
"attributes": {
"endpoint": "/api/users"
}
}
Statement
set(cache["ts_time"], Time("1970-01-01T00:00:00Z", "%Y-%m-%dT%H:%M:%SZ"))
set(cache["ts_time"], TruncateTime(cache["ts_time"], Duration(Format("%dms", [timestamp]))))
set(attributes["hour"], Hour(cache["ts_time"]))
set(attributes["day_of_week"], Weekday(cache["ts_time"]))
set(attributes["is_business_hours"], EDXIfElse(attributes["hour"] >= 9 and attributes["hour"] < 17 and attributes["day_of_week"] >= 1 and attributes["day_of_week"] <= 5, true, false))
Output
{
"_type": "log",
"timestamp": 1735822245000,
"body": "API request received",
"resource": {},
"attributes": {
"endpoint": "/api/users",
"hour": 12,
"day_of_week": 4,
"is_business_hours": true
}
}
Explanation: Extracts hour (12) and weekday (4=Thursday) from the timestamp, then uses EDXIfElse to determine if it falls within business hours (9 AM - 5 PM, Monday-Friday).
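The derived flag can then drive downstream statements; for example, a sketch that tags off-hours events for a separate alerting path (the alert_priority attribute is illustrative):
set(attributes["alert_priority"], "after_hours") where attributes["is_business_hours"] == false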
Use Case 6: Calculate Time Differences and Durations
Example: Calculate request duration in milliseconds and hours
Input
{
"_type": "log",
"timestamp": 1735833045000,
"body": "Request completed",
"resource": {},
"attributes": {
"request_start": 1735829445000
}
}
Statement
set(attributes["duration_ms"], timestamp - attributes["request_start"])
set(attributes["duration_hours"], Milliseconds(Duration(Format("%dms", [attributes["duration_ms"]]))) / 3600000)
Output
{
"_type": "log",
"timestamp": 1735833045000,
"body": "Request completed",
"resource": {},
"attributes": {
"request_start": 1735829445000,
"duration_ms": 3600000,
"duration_hours": 1.0
}
}
Explanation: Calculates duration by subtracting the start time from the end time, resulting in 3600000 ms (1 hour). The Duration() and Milliseconds() round-trip normalizes the value, and dividing by 3600000 converts milliseconds to hours.
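When you only need numeric values, plain arithmetic on duration_ms avoids the Duration() round-trip entirely; a minimal sketch with illustrative attribute names:
set(attributes["duration_sec"], attributes["duration_ms"] / 1000) where attributes["duration_ms"] != nil
set(attributes["duration_min"], attributes["duration_ms"] / 60000) where attributes["duration_ms"] != nil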
Use Case 7: Truncate Time to Hour/Day/Week Boundaries
Example: Round timestamp down to start of hour for time-series bucketing
Input
{
"_type": "log",
"timestamp": 1735833045123,
"body": "Metric point",
"resource": {},
"attributes": {
"cpu_usage": 75.3
}
}
Statement
set(cache["ts_time"], Time("1970-01-01T00:00:00Z", "%Y-%m-%dT%H:%M:%SZ"))
set(cache["ts_time"], TruncateTime(cache["ts_time"], Duration(Format("%dms", [timestamp]))))
set(cache["ts_time_truncated"], TruncateTime(cache["ts_time"], Duration("1h")))
set(attributes["hour_bucket"], UnixMilli(cache["ts_time_truncated"]))
Output
{
"_type": "log",
"timestamp": 1735833045123,
"body": "Metric point",
"resource": {},
"attributes": {
"cpu_usage": 75.3,
"hour_bucket": 1735830000000
}
}
Explanation: The TruncateTime() function rounds the timestamp (15:30:45) down to the start of the hour (15:00:00), creating 1735830000000. This is useful for time-series aggregation and bucketing metrics.
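The day and week boundaries from the heading follow the same pattern with a different duration. Reusing cache["ts_time"] from above, the sketch below buckets to midnight UTC (for this input, 1735776000000, i.e. 2025-01-02 00:00:00 UTC). For "168h" week buckets, the starting weekday depends on where the implementation anchors its multiples, so verify which weekday your buckets start on before relying on week alignment:
set(cache["day_bucket"], TruncateTime(cache["ts_time"], Duration("24h")))
set(attributes["day_bucket"], UnixMilli(cache["day_bucket"]))
set(cache["week_bucket"], TruncateTime(cache["ts_time"], Duration("168h")))
set(attributes["week_bucket"], UnixMilli(cache["week_bucket"]))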
Use Case 8: Add Current Timestamp for Processing Time
Example: Add processing timestamp to track pipeline latency
Input
{
"_type": "log",
"timestamp": 1735829445000,
"body": "Original event",
"resource": {},
"attributes": {}
}
Statement
set(attributes["processed_at"], UnixMilli(Now()))
set(attributes["processing_delay_ms"], attributes["processed_at"] - timestamp)
Output
{
"_type": "log",
"timestamp": 1735829445000,
"body": "Original event",
"resource": {},
"attributes": {
"processed_at": 1735833045000,
"processing_delay_ms": 3600000
}
}
Explanation: Now() returns the current system time, UnixMilli() converts it to milliseconds, and the delay is calculated showing the event was processed 3600000ms (1 hour) after it occurred.
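A natural follow-up is alerting on slow pipelines; the sketch below flags events processed more than one minute after they occurred (the threshold and attribute name are illustrative):
set(attributes["delayed"], EDXIfElse(attributes["processing_delay_ms"] > 60000, true, false)) where attributes["processing_delay_ms"] != nil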
Use Case 9: Convert Human-Readable Duration to Milliseconds
Example: Parse duration string from log and convert to milliseconds
Input
{
"_type": "log",
"timestamp": 1735833045000,
"body": "Session timeout after 2h30m15s",
"resource": {},
"attributes": {
"session_duration": "2h30m15s"
}
}
Statement
set(attributes["duration_ms"], Int(Milliseconds(Duration(attributes["session_duration"]))))
Output
{
"_type": "log",
"timestamp": 1735833045000,
"body": "Session timeout after 2h30m15s",
"resource": {},
"attributes": {
"session_duration": "2h30m15s",
"duration_ms": 9015000
}
}
Explanation: Duration() parses the human-readable duration “2h30m15s”, Milliseconds() converts it to milliseconds (9015000), and Int() ensures it’s stored as an integer.
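With the duration in milliseconds, ordinary arithmetic yields derived timestamps; for example, a sketch projecting when the session will expire (session_end_ms is an illustrative attribute):
set(attributes["session_end_ms"], timestamp + attributes["duration_ms"]) where attributes["duration_ms"] != nil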
Use Case 10: Convert UnixMilli to Multiple Formats for Multi-System Integration
Example: Single timestamp converted for ServiceNow, Splunk, and Elasticsearch
Input
{
"_type": "log",
"timestamp": 1735833045123,
"body": "Critical security event",
"resource": {},
"attributes": {
"event_type": "authentication_failure"
}
}
Statement
// Create base time object
set(cache["epoch"], Time("1970-01-01T00:00:00Z", "%Y-%m-%dT%H:%M:%SZ"))
set(cache["event_time"], TruncateTime(cache["epoch"], Duration(Format("%dms", [timestamp]))))
// ServiceNow format: yyyy-MM-dd HH:mm:ss
set(attributes["servicenow_time"], Format("%04d-%02d-%02d %02d:%02d:%02d", [Year(cache["event_time"]), Month(cache["event_time"]), Day(cache["event_time"]), Hour(cache["event_time"]), Minute(cache["event_time"]), Second(cache["event_time"])]))
// Splunk format: RFC3339 with microseconds
set(cache["micros"], Int((timestamp - (Int(timestamp / 1000) * 1000)) * 1000))
set(attributes["splunk_time"], Format("%04d-%02d-%02dT%02d:%02d:%02d.%06dZ", [Year(cache["event_time"]), Month(cache["event_time"]), Day(cache["event_time"]), Hour(cache["event_time"]), Minute(cache["event_time"]), Second(cache["event_time"]), cache["micros"]]))
// Elasticsearch format: ISO8601 with milliseconds
set(attributes["elasticsearch_time"], Format("%04d-%02d-%02dT%02d:%02d:%02d.%03dZ", [Year(cache["event_time"]), Month(cache["event_time"]), Day(cache["event_time"]), Hour(cache["event_time"]), Minute(cache["event_time"]), Second(cache["event_time"]), Int(timestamp - (Int(timestamp / 1000) * 1000))]))
Output
{
"_type": "log",
"timestamp": 1735833045123,
"body": "Critical security event",
"resource": {},
"attributes": {
"event_type": "authentication_failure",
"servicenow_time": "2025-01-02 15:30:45",
"splunk_time": "2025-01-02T15:30:45.123000Z",
"elasticsearch_time": "2025-01-02T15:30:45.123Z"
}
}
Explanation: This comprehensive example converts a single UnixMilli timestamp into three different formats needed for common SIEM and ticketing integrations. Each format follows the target system’s API requirements.
Time Format Patterns Reference
When using the Time() function to parse datetime strings, use these format patterns:
| Pattern | Description | Example |
|---|---|---|
| %Y | 4-digit year | 2025 |
| %m | 2-digit month | 01 |
| %d | 2-digit day | 02 |
| %H | 2-digit hour (24h) | 15 |
| %M | 2-digit minute | 30 |
| %S | 2-digit second | 45 |
| %f | Microseconds | 123456 |
| %z | Timezone offset | +00:00, -08:00 |
| %Z | Timezone name | UTC, PST |
| %T | Shorthand for %H:%M:%S | 15:30:45 |
Common Format Examples:
- ISO8601: %Y-%m-%dT%H:%M:%SZ
- RFC3339: %Y-%m-%dT%H:%M:%S.%f%z
- ServiceNow: %Y-%m-%d %H:%M:%S
- Custom Log: %m/%d/%Y %H:%M:%S
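To apply any of these patterns, pass the pattern as the second argument to Time(); for example, a sketch parsing the custom log format above from a hypothetical raw_time attribute:
set(timestamp, UnixMilli(Time(attributes["raw_time"], "%m/%d/%Y %H:%M:%S"))) where attributes["raw_time"] != nil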
Duration Format Reference
When using the Duration() function, these units are supported:
- h - hours (e.g., "2h")
- m - minutes (e.g., "30m")
- s - seconds (e.g., "45s")
- ms - milliseconds (e.g., "500ms")
- us or µs - microseconds (e.g., "1000us")
- ns - nanoseconds (e.g., "1000000ns")
Combination Examples:
"2h30m"- 2 hours 30 minutes"1h30m45s"- 1 hour 30 minutes 45 seconds"500ms"- 500 milliseconds"90s"- 90 seconds (1 minute 30 seconds)
Best Practices
1. Always Use UTC for Timestamp Storage
Store all timestamps in UTC (UnixMilli format) and convert to local timezones only when displaying to users or integrating with systems that require specific timezones.
2. Use cache for Intermediate Time Objects
Time conversions can be computationally expensive. Store intermediate time.Time objects in cache to avoid redundant conversions:
set(cache["event_time"], Time("1970-01-01T00:00:00Z", "%Y-%m-%dT%H:%M:%SZ"))
set(cache["event_time"], TruncateTime(cache["event_time"], Duration(Format("%dms", [timestamp]))))
// Now reuse cache["event_time"] for multiple extractions
set(attributes["hour"], Hour(cache["event_time"]))
set(attributes["day"], Day(cache["event_time"]))
3. Validate Timestamp Ranges
Add validation to prevent invalid timestamps:
set(attributes["valid_timestamp"], EDXIfElse(timestamp > 0 and timestamp < 9999999999999, true, false)) where attributes["valid_timestamp"] == true
4. Handle Missing Timestamps Gracefully
Use EDXCoalesce to provide fallback timestamps:
set(timestamp, EDXCoalesce(UnixMilli(Time(attributes["log_time"], "%Y-%m-%dT%H:%M:%SZ")), UnixMilli(Now())))
5. Document Your Time Format Assumptions
Add comments or attributes to clarify timezone assumptions:
set(attributes["timestamp_timezone"], "UTC")
set(attributes["timestamp_source"], "application_log")
Related Documentation
- Manage Log Timestamps - Complete guide to timestamp handling in Edge Delta
- Parse Timestamp Processor - Dedicated processor for timestamp parsing
- Format Converter - String formatting function
- Time and UnixMilli - Core time conversion functions
- Duration Functions - Duration manipulation
- OTTL Statements Guide - General OTTL syntax and patterns
Testing Your Time Conversions
All examples in this guide are designed to be testable in the Edge Delta Visual Pipeline Builder:
1. Create a new pipeline or use an existing one
2. Add an OTTL Transform processor
3. Copy the Input JSON into a test event
4. Paste the Statement into the processor
5. Verify the Output matches the expected result
For comprehensive timestamp management, consider using the Parse Timestamp Processor which provides automatic format detection and validation.