AI Team Overview

Explore Edge Delta’s AI Team: specialized AI teammates and custom teammates that work alongside your team to streamline observability, security, and infrastructure management.

Overview

Edge Delta’s AI Team forms a coordinated layer of intelligence across observability, security, and infrastructure operations. Each teammate carries a clearly defined role, so you can hand off routine investigations or complex incidents without losing context. Out of the box you gain immediate access to specialists, and you can introduce custom-built teammates as your priorities evolve.

AI Team Chat interface showing channels, direct messages, and conversation area

Together, these teammates study telemetry, surface anomalies, draft remediation steps, review code, and keep stakeholders aligned. Because the AI Team is deeply linked to Edge Delta Telemetry Pipelines and a broad connector ecosystem, the teammates work directly with the same tools, data streams, and workflows that already power your environment.

How the AI Team Works

The AI Team meets you wherever work happens. Shared channels keep specialized teammates and human teammates focused on topics such as alerts-feed, incident-response, platform-ops, code-issues, and security-events, and you can spin up your own channels for bespoke projects. One-to-one conversations happen in direct messages, which are reserved for read-only information gathering: use them to query teammates for insights and analysis. When teammates need to make changes to your infrastructure (e.g., using MCP to update configurations), those actions must happen in channels to provide organization-wide visibility. Overseeing every exchange is OnCall AI, which listens to each request, delegates tasks to the right specialist, and returns an integrated summary so you always know what happened.

Core Components

OnCall AI

OnCall AI is the front door to the entire team. Ask it about the state of your telemetry pipelines, an incident timeline, or a deployment risk, and it will collect what it can, loop in the right specialists, and share a concise synthesis. When a request spans multiple steps—such as diagnosing an alert, validating remediation, and notifying stakeholders—OnCall AI keeps the thread organized for you.

Specialized Teammates

Several specialized teammates ship with the platform so you can cover key operational domains on day one. The Cloud Engineer tracks infrastructure health, spending, quotas, and scaling decisions. The DevOps Engineer keeps Kubernetes, container, and CI/CD workflows in line, while the SRE teammate accelerates detection, triage, paging, and follow-up. The Issue Coordinator watches GitHub and project trackers to make sure work stays synchronized, and the Code Analyzer flags risky pull requests or missing tests before they become regressions. The Security Engineer maintains security posture across services such as AWS, GitHub, Jira, and PagerDuty. All of them collaborate seamlessly with OnCall AI to funnel their findings into a single narrative.

Teammates tab showing built-in specialized teammates and custom teammates

Learn more about specialized teammates

Custom Teammates

Pre-built teammates cover common needs, but many teams tailor AI teammates to the shape of their environment. You can define the system prompt that guides behavior, limit or expand connector access, seed conversations with helpful starters, and schedule periodic checks that deliver proactive updates. When a use case calls for a distinct foundation model, attach the one that best reflects the expertise you need.
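
Teammate configuration happens in the Edge Delta UI rather than in code, but a rough sketch can make the moving parts concrete. The Python dictionary below is purely illustrative: every field name, the example teammate, and the schedule format are assumptions, not the product's actual schema. It simply groups the settings mentioned above: system prompt, connector scope, conversation starters, scheduled checks, and model choice.

```python
# Illustrative only: the field names and values below are assumptions, not Edge Delta's
# actual teammate schema. Configuration happens in the Edge Delta UI, not in code.
custom_teammate = {
    "name": "Postgres Performance Analyst",  # hypothetical custom teammate
    "system_prompt": (
        "You monitor PostgreSQL health for the payments platform. "
        "Prioritize replication lag, slow queries, and connection saturation."
    ),
    # Limit or expand connector access to match the teammate's scope.
    "connectors": ["edge-delta-mcp", "pagerduty", "github"],
    # Seed conversations with helpful starters.
    "conversation_starters": [
        "Summarize slow queries from the last hour",
        "Is there replication lag on the primary?",
    ],
    # Schedule periodic checks that deliver proactive updates.
    "scheduled_checks": [
        {"every": "1h", "prompt": "Post a database health summary to the platform-ops channel"},
    ],
    # Attach the foundation model that best reflects the expertise you need.
    "model": "your-preferred-foundation-model",
}
```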

Learn more about creating teammates

Connectors

Connectors give the AI Team secure, purposeful access to the systems you already rely on. More than forty integrations are available, spanning cloud platforms like AWS, Azure, and Google Cloud; development tooling such as GitHub, Jenkins, and CircleCI; incident and project management with PagerDuty, Jira, and Linear; collaboration in Slack or Microsoft Teams; and data platforms ranging from Databricks to Kafka and Pub/Sub. The team can also operate through Edge Delta MCP and custom remote MCP servers to monitor Kubernetes, Docker, and other infrastructure surfaces.

Connectors tab showing available integrations for AI Team

Connector Types

AI Team uses two types of connectors:

  • Event Connectors: Provide event-driven data to AI teammates (e.g., AWS EventBridge, PagerDuty, GitHub webhooks). When you configure your first event connector, Edge Delta automatically creates a special cloud pipeline that routes data to OnCall AI.

  • Streaming Connectors: Provide continuous telemetry streams (logs, metrics, traces). These connectors integrate with your existing Edge Delta pipelines, allowing you to control what data AI teammates can access using standard pipeline processors. The sketch after this list contrasts the two data paths.
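
If it helps to picture the difference, the short sketch below models both data paths in plain Python. The class and names are hypothetical stand-ins, not Edge Delta APIs: it only illustrates that event connectors push discrete events into an auto-created pipeline bound to OnCall AI, while streaming connectors flow through your own pipelines before teammates query the stored telemetry.

```python
# Hypothetical sketch; this Pipeline class is an illustrative stand-in, not an Edge Delta API.

class Pipeline:
    def __init__(self, name: str, destination: str):
        self.name = name
        self.destination = destination

    def ingest(self, record: dict) -> None:
        # In Edge Delta, pipeline processors would run here before the record reaches its destination.
        print(f"[{self.name}] -> {self.destination}: {record}")

# Event connectors feed a dedicated, auto-created cloud pipeline whose destination is OnCall AI.
event_pipeline = Pipeline("ai-team-cloud-pipeline", destination="OnCall AI teammate")
event_pipeline.ingest({"source": "pagerduty", "type": "incident.triggered"})

# Streaming connectors join one of your existing pipelines; the processed telemetry is stored
# in the Edge Delta backend and queried later by teammates through the Edge Delta MCP connector.
streaming_pipeline = Pipeline("prod-logs", destination="Edge Delta backend")
streaming_pipeline.ingest({"source": "kubernetes", "type": "log", "body": "pod restarted"})
```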

How Event Connectors Create Pipelines

When you configure your first event connector, Edge Delta automatically:

  1. Creates a special cloud pipeline dedicated to AI Team
  2. Adds the event connector as an input to this pipeline
  3. Routes all data to an OnCall AI teammate destination

When you configure additional event connectors, they are added as new inputs to the same AI Team cloud pipeline, all routing to the OnCall AI destination.

Event connector pipeline showing PagerDuty routing to the AI teammates destination in the automatically created OnCall AI Connector cloud pipeline

Pipeline Control: Like any Edge Delta pipeline, you can add processors to filter, transform, or enrich data before it reaches the AI teammates. This gives you precise control over what information AI Team can access and analyze.
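
To picture what that control looks like, the sketch below shows the kinds of steps a processor stage might perform before an event is handed to OnCall AI: dropping noisy events, masking sensitive values, and adding context. It is a plain-Python stand-in, a minimal sketch rather than Edge Delta's processor syntax.

```python
import re

# Illustrative stand-in only: Edge Delta processors are configured in the pipeline itself,
# not written as Python. This just shows the kind of control the processors give you.

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def process(event: dict) -> dict | None:
    """Filter, transform, and enrich an event before it reaches AI teammates."""
    if event.get("severity") == "debug":                         # filter: drop noisy events
        return None
    body = EMAIL.sub("[redacted-email]", event.get("body", ""))  # transform: mask PII
    return {**event, "body": body, "service_tier": "critical"}   # enrich: add context

print(process({"severity": "error", "body": "checkout failed for jane@example.com"}))
```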

How Streaming Connectors Work

When you configure a streaming connector, you’ll be prompted to select which pipeline should receive the data:

  • Existing Pipeline: Add the streaming connector to an existing pipeline as a new input
  • New Pipeline: If you don’t have an existing pipeline, Edge Delta shows the standard pipeline installation flow (same as clicking “New Pipeline” on the Pipelines page)

AI teammates access telemetry data from streaming connectors through the Edge Delta MCP connector. When you add a streaming connector, the data flows into your Edge Delta pipeline where it is processed and stored. AI teammates then query this data from the Edge Delta backend for analysis and investigation.
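
The access pattern can be sketched roughly as follows: telemetry lands in the pipeline and is stored first, and teammates query it afterwards instead of reading the live stream. The client, method, and query syntax below are hypothetical placeholders, not the Edge Delta MCP connector's actual interface.

```python
# Hypothetical placeholder for the Edge Delta MCP connector; the client, method, and query
# syntax below are invented for illustration and are not the connector's actual interface.

class TelemetryClient:
    """Stand-in for the MCP-backed access path a teammate uses to query stored telemetry."""

    def query_logs(self, query: str, lookback: str) -> list[dict]:
        # The real flow would call an Edge Delta MCP tool against the backend; this is a stub.
        return [{"timestamp": "2025-01-01T00:00:00Z", "body": f"stubbed result for: {query}"}]

# A teammate investigating an alert queries stored telemetry rather than tapping the live stream:
client = TelemetryClient()
for record in client.query_logs(query="service:checkout level:error", lookback="15m"):
    print(record["timestamp"], record["body"])
```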

Connector Assignment

When you configure an event connector, it is automatically assigned to relevant built-in specialized teammates based on the connector type. For example, the PagerDuty connector is automatically assigned to the SRE and DevOps Engineer teammates.

Streaming connectors are not assigned to teammates. Instead, AI teammates access streaming telemetry data through the Edge Delta MCP Connector.

You can optionally modify event connector assignments through the teammate configuration interface. For custom teammates you create, you must manually assign the event connectors they should have access to.

Learn more about connectors

Use Cases

Teams rely on the AI Team to stay ahead of the unexpected. During an incident, teammates correlate metrics and logs, assemble timelines, suggest mitigations, and keep a record of actions taken. Outside of emergencies they watch resource usage for cost or capacity concerns, keep security policy drift in check, and champion code quality by reviewing pull requests before they reach human reviewers.

Because every teammate understands the workflows that tie your tools together, routine coordination noticeably improves. A teammate can open a ticket, update a deployment plan, share the summary in Slack, and archive the context for audits without you juggling multiple interfaces. The same connective tissue also powers observability insights, so the team can surface emerging trends, highlight negative sentiment, or draft dashboards that explain what changed.

Getting Started

When you are ready to bring the AI Team into daily operations, start with the Getting Started Guide for channel creation, teammate configuration, and connector setup.