Connectors Overview
Connectors enable AI teammates to fetch data, call tools, and interact with external systems. They serve as bridges between Edge Delta’s AI Team and your infrastructure, development tools, cloud platforms, and data sources. Configure a connector with credentials and settings, assign it to teammates (or let it auto-assign to specialized teammates), and the teammates then use the connector's tools to perform operations.
What Are Connectors?
A connector provides:
- Authentication: Secure connection to external services
- Tools: Specific capabilities (e.g., list resources, query data, create issues)
- Permissions: Read-only or read-write access controls
- Configuration: Service-specific settings (endpoints, regions, filters)
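As a rough illustration of how these pieces fit together, the sketch below models the four facets as a Python structure. The field names and values are invented for this example and do not reflect Edge Delta's actual configuration schema.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the four facets a connector bundles together.
# Field names are invented for illustration, not Edge Delta's actual schema.
@dataclass
class ConnectorConfig:
    name: str                          # connector instance name
    credentials: dict                  # Authentication: tokens, keys, OAuth details
    tools: list                        # Tools: capabilities exposed to teammates
    permissions: dict                  # Permissions: per-tool "allow" or "ask"
    settings: dict = field(default_factory=dict)  # Configuration: endpoints, regions, filters

example = ConnectorConfig(
    name="aws-prod",
    credentials={"role_arn": "arn:aws:iam::123456789012:role/example-readonly"},
    tools=["list_resources", "query_metrics"],
    permissions={"list_resources": "allow", "query_metrics": "allow"},
    settings={"region": "us-east-1"},
)
print(example.name, example.settings)
```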
The AI Team operates across two fundamentally different data patterns, each requiring a distinct connector architecture optimized for its operational characteristics. The two architectures are complementary: streaming connectors provide the comprehensive telemetry foundation that teammates query during investigations, while event connectors trigger those investigations at the moments when human attention is most valuable.
Event Connectors
Two-way, MCP-based connectors that enable AI teammates to interact with external platforms. Depending on the connector, they can receive events (GitHub webhooks, PagerDuty incidents) and send actions (create Jira tickets, query AWS APIs).
Event connectors deliver discrete, actionable signals that trigger autonomous teammate workflows: PagerDuty incidents requiring immediate investigation, GitHub pull requests awaiting code review, AWS security findings demanding compliance assessment, Slack mentions needing response. These connectors operate through Model Context Protocol (MCP) servers that expose both inbound event streams and outbound action capabilities—creating tickets, posting updates, querying APIs, modifying configurations.
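The sketch below illustrates that event-in, action-out pattern at a conceptual level. All function and field names are hypothetical; it is not Edge Delta's internal implementation or MCP API.

```python
from dataclasses import dataclass

# Conceptual sketch of the event-in / action-out pattern behind an event
# connector. Every name here is hypothetical, not an actual Edge Delta API.

@dataclass
class Findings:
    summary: str
    needs_followup: bool

def investigate(service: str) -> Findings:
    # Stand-in for the teammate's investigation (querying telemetry, etc.).
    return Findings(summary=f"Elevated error rate on {service}", needs_followup=True)

def post_status_update(incident_id: str, text: str) -> None:
    print(f"[update -> incident {incident_id}] {text}")   # stand-in for an outbound action

def create_ticket(project: str, title: str) -> None:
    print(f"[ticket -> {project}] {title}")               # stand-in for another outbound action

def handle_incident_event(event: dict) -> None:
    """Inbound: a discrete signal arrives, e.g. a PagerDuty incident webhook."""
    incident_id = event["incident"]["id"]
    service = event["incident"]["service"]["summary"]
    findings = investigate(service)                        # workflow triggered by the event
    post_status_update(incident_id, findings.summary)      # outbound: post an update
    if findings.needs_followup:
        create_ticket("OPS", f"Follow-up: {service}")      # outbound: open a ticket

handle_incident_event({"incident": {"id": "PD-123", "service": {"summary": "checkout-api"}}})
```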
When you configure your first event connector, Edge Delta automatically provisions a dedicated AI Team ingestion pipeline that routes events to OnCall AI. Ingestion pipelines are stateless and require no provisioning, providing instant creation and automatic scaling. Additional event connectors become inputs to this same pipeline, consolidating event-driven workflows through a unified orchestration layer. This architectural separation ensures event-triggered investigations receive priority routing without competing with high-volume telemetry streams, while maintaining consistent governance and audit trails across both patterns.
Note: Accounts created before the ingestion pipeline release may use cloud pipelines for AI Teammates connectors. These continue to function normally. To migrate to an ingestion pipeline, disconnect and reconnect your connectors.
Streaming Connectors
Data ingestion connectors that continuously stream telemetry data into Edge Delta Pipelines. They collect logs, metrics, traces, and events, making them available for AI teammates to query through the Edge Delta MCP connector. Streaming connectors are the same as sources in an environment (pipeline)—when you configure a streaming connector, you’re adding a source to a pipeline.
Streaming connectors handle continuous, high-volume telemetry flows—logs, metrics, traces, and events generated by applications and infrastructure. These connectors integrate with Edge Delta’s proven telemetry pipeline infrastructure where processors apply parsing, enrichment, masking, and routing before data reaches storage or downstream destinations. The same governance controls that protect production pipelines (RBAC, data masking, retention policies) apply uniformly to data that AI teammates access through the Edge Delta MCP connector. Organizations leverage existing pipeline investments while extending them with AI analysis capabilities.
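As a conceptual sketch only (not Edge Delta pipeline configuration), the processing stages described above can be pictured as a chain of functions applied to each record before it reaches storage; the stage logic here is illustrative.

```python
import re

# Illustrative chain of processors applied to a log record before storage.
# The stages and logic are examples only, not Edge Delta pipeline syntax.

def parse(raw: str) -> dict:
    level, _, message = raw.partition(" ")
    return {"level": level, "message": message}

def enrich(record: dict) -> dict:
    return {**record, "service": "checkout-api", "env": "prod"}

def mask(record: dict) -> dict:
    record["message"] = re.sub(r"\b\d{16}\b", "****", record["message"])  # hide card-like numbers
    return record

def route(record: dict) -> str:
    return "errors-index" if record["level"] == "ERROR" else "default-index"

record = mask(enrich(parse("ERROR payment failed for card 4111111111111111")))
print(route(record), record)
```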
Connector and Pipeline Relationship
When you configure a connector, it creates or attaches to a pipeline:
- Event connectors automatically provision an ingestion pipeline (or add to an existing one)
- Streaming connectors add a source to your selected pipeline
Note: Manage connectors through the AI Team interface rather than editing the underlying pipeline directly. Changes to connector-managed pipelines through the Pipeline Builder may cause configuration mismatches.
Connector-Level Permissions
Configure default approval settings in the connector’s Tools tab:
- Allow: Execute without approval (read-only operations)
- Ask Permission: Require approval (write operations)
You can restrict specific tools when assigning connectors to teammates. Navigate to AI Team → Teammates → Edit → Connectors section to enable/disable tools per teammate or set specific permissions for that teammate.
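One way to picture how connector-level defaults and per-teammate settings combine is the sketch below; the keys and precedence rules are invented for illustration.

```python
# Illustrative model of connector-level defaults with per-teammate overrides.
# Keys, values, and precedence are invented for this sketch.
connector_defaults = {
    "list_issues": "allow",    # read-only: execute without approval
    "create_issue": "ask",     # write: require approval
}

teammate_overrides = {
    "oncall-ai": {"create_issue": "disabled"},  # tool turned off for this teammate
}

def effective_permission(teammate: str, tool: str) -> str:
    # A teammate-specific setting, if present, takes precedence over the connector default.
    return teammate_overrides.get(teammate, {}).get(tool, connector_defaults[tool])

print(effective_permission("oncall-ai", "create_issue"))  # disabled
print(effective_permission("oncall-ai", "list_issues"))   # allow
```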
Webhooks and Event Listening
Event connectors receive events through webhooks—HTTP requests sent to a connector-specific URL with an authentication token.
The GitHub connector listens for events by default once you authenticate. No additional webhook configuration is required in GitHub; Edge Delta automatically receives events for repositories the authenticated account can access.
Other connectors (PagerDuty, Atlassian, Sentry, etc.) require you to configure outgoing webhooks in the external tool:
- Copy the Webhook URL and Webhook Token from the connector configuration in Edge Delta
- Configure the external tool to send webhooks to this URL, passing the token in an Authorization: Bearer <token> header
- Select which event types to send (incidents, issues, alerts, etc.)
For example, in PagerDuty, use the Generic Webhooks (v3) integration to configure outgoing webhooks. See each connector’s documentation for specific setup instructions.
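To sanity-check a webhook configuration, you can send a test request with the same header the external tool will use. This is a minimal sketch that assumes the endpoint accepts a JSON POST; the URL and token are placeholders you copy from the connector configuration.

```python
import json
import urllib.request

# Placeholders: copy the real values from the connector configuration in Edge Delta.
WEBHOOK_URL = "https://<webhook-url-from-connector-config>"
WEBHOOK_TOKEN = "<webhook-token>"

# Minimal test payload; real payloads are sent by the external tool (PagerDuty, Sentry, etc.).
payload = json.dumps({"event": {"type": "test", "summary": "connectivity check"}}).encode()

request = urllib.request.Request(
    WEBHOOK_URL,
    data=payload,
    headers={
        "Authorization": f"Bearer {WEBHOOK_TOKEN}",  # token sent as a bearer header
        "Content-Type": "application/json",
    },
    method="POST",
)

with urllib.request.urlopen(request) as response:
    print(response.status)
```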
Without webhook configuration, AI teammates can query data from these connectors when prompted, but won’t respond to events automatically.
Common Questions
Why doesn’t my streaming connector show a connection to AI?
Streaming connectors send data to Edge Delta pipelines, not directly to AI teammates. AI teammates access this data through the Edge Delta MCP connector, which queries the Edge Delta backend. As long as your streaming data reaches Edge Delta, teammates can analyze it—no direct AI connection is needed.
I have both a pipeline and a streaming connector—is that redundant?
No. When you configure a streaming connector, you’re adding a source to a pipeline. The connector provides a simplified interface for configuring common sources (like Kubernetes logs), while the pipeline processes and routes the data. They work together.