Add a Pack to a Pipeline
Overview
Recap
You can create a segment of a pipeline and save it as a single pipeline object called a pack.
There are two types of packs:
- Organization Packs that you or users in your organization create. These are listed in the Packs table and are available to anyone in your organization who has the required permissions.
- Packs in the Packs Library created by Edge Delta for specific use cases. These are available on the Knowledge tab for all organizations.
See Creating an Edge Delta Pack.
Use a Pack
Open the pipeline to which you want to add an existing pack:
- Click Pipelines.
- Select the fleet you want to add the pack to and click View/Edit Pipeline.
- Click Edit Mode.
- Click Add Processor, expand Packs, and select the pack you want to add.
- Optionally, update the name.
- Optionally, click Edit Node to make changes to the contents of this pack instance.
Adding a pack adds an instance of it with its own name. If you change the pack instance in the pipeline, the original pack saved in the Packs table is not updated. Likewise, if the original pack is updated in the pack editor, existing instances of that pack in pipelines do not change. If a pack is updated in the pack editor and you want the updated version in your pipeline, you need to update the pipeline yourself.
- Click OK.
Connect the pack’s input and outputs to the upstream and downstream nodes. The following example takes these steps:
- Delete the existing link where the pack will reside.
- Connect the Input to the pack.
- Connect the Processed output path of the pack to the mask_ssn node.
- Connect the Passthrough output path of the pack to the Raw_storage destination node.
This configuration will add the pack’s logic to the start of an existing pipeline. It will also output unprocessed logs to a separate destination node.
- Click Review Changes.
- Click Save Changes.
The pipeline will be saved with the new pack. If the pipeline has been used to install an agent, that agent’s configuration will be updated to include the new pack.
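The example wiring above can be pictured as a small routing table: the pack's Processed output path feeds the mask_ssn node, while its Passthrough path feeds the Raw_storage destination. The following Python sketch only illustrates that fan-out; the node and path names come from the example above, and the routing logic is an assumption for illustration, not how the Edge Delta agent implements links.

```python
# Illustrative sketch of the example wiring only; not Edge Delta agent code.
# Each output path of the pack is linked to exactly one downstream node.
LINKS = {
    "Processed": "mask_ssn",       # processed logs continue through the pipeline
    "Passthrough": "Raw_storage",  # untouched copies go straight to a destination
}

def route(log: dict, output_path: str) -> tuple[str, dict]:
    """Return the downstream node that receives a log emitted on the given path."""
    return LINKS[output_path], log

# A pack that emits on both paths effectively duplicates each incoming log:
sample = {"message": "Exiting from method processPaymentTransaction after 20ms"}
for path in ("Processed", "Passthrough"):
    node, item = route(sample, path)
    print(f"{path} -> {node}: {item['message']}")
```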
Test a Pack
- Click Pipelines.
- Select the fleet containing the pack you want to test and click View/Edit Pipeline.
- Click Edit Mode.
- Open the pack.
- Paste a sample log in the Paste log data field.
- Select Test this node in isolation.
- Click Process Samples.
- Click Path: Processed.
In this example, the following log sample is tested against the pack:
{"timestamp": "2024-02-21T18:32:49.619636Z", "node_id": "node11", "event":"exit", "logLevel": "TRACE", "request": {"method": "DELETE", "endpoint": "/api/v3/users/88", "headers": {"Host": "api1.myapp.com", "Authorization": "+psQPgKcS", "Accept": "application/json"}}, "response": {"status": 404, "time_ms": 271, "body": {"id": 59, "name": "user244", "email": "user934@example.com"}}, "userId": 5, "message": "Exiting from method processPaymentTransaction after 20ms"}
Note the event field in the JSON body and the presence of an email address. In the Outgoing Data Items section, the Processed path shows that the event field has been added as an attribute and the email address has been obfuscated. There is also a result in the Passthrough path: the log has been duplicated, and the duplicate is sent to its output without any processing.
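To make those results concrete, here is a minimal Python sketch of the behavior described above: the event field is promoted to an attribute, the email address is obfuscated, and an untouched duplicate is emitted for the Passthrough path. It mimics the described outcome only; the masking pattern, attribute layout, and helper names are assumptions, not the pack's actual processors or the agent's data model.

```python
import json
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def mask(value):
    """Recursively obfuscate email addresses in string values (illustrative only)."""
    if isinstance(value, str):
        return EMAIL.sub("REDACTED", value)
    if isinstance(value, dict):
        return {k: mask(v) for k, v in value.items()}
    return value

def run_pack(log: dict) -> tuple[dict, dict]:
    """Return (processed, passthrough) copies of a log, mimicking the example pack."""
    passthrough = json.loads(json.dumps(log))  # unmodified duplicate for Passthrough
    processed = json.loads(json.dumps(log))    # deep copy so edits do not leak
    processed["attributes"] = {"event": processed.pop("event", None)}  # promote event
    processed = mask(processed)                # obfuscate the email address
    return processed, passthrough

# Abbreviated version of the sample log shown above.
log = {
    "timestamp": "2024-02-21T18:32:49.619636Z",
    "event": "exit",
    "response": {"body": {"email": "user934@example.com"}},
    "message": "Exiting from method processPaymentTransaction after 20ms",
}
processed, passthrough = run_pack(log)
print(json.dumps(processed, indent=2))    # event under attributes, email redacted
print(json.dumps(passthrough, indent=2))  # identical to the input
```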