Integrations

Cloud Setup

LogPulse integrates directly with your CI/CD and deployment platforms to collect audit logs, pipeline events, and deployment data. Set up cloud integrations to get full visibility into your DevOps workflows without installing any agents.

Overview

Cloud integrations connect LogPulse to your existing platforms via API polling and webhooks. Events are automatically normalized to standard data models, making them searchable with LPQL alongside your application logs.

• 3 active providers
• 2 collection methods
• 5+ data sources
Note: Cloud integrations collect platform-level events (builds, deploys, audits). For application logs from your own services, use the Vector Agent or HTTP API instead.

Supported Providers

LogPulse currently supports three cloud providers for direct integration. Each provider offers both polling and webhook-based data collection.

Azure DevOps

Audit logs, service hooks, and pipeline logs from Azure DevOps organizations.

GitHub Actions

Workflow runs, job results, and repository events from GitHub Actions.

Railway

Deployments, build logs, runtime logs, and deployment events from Railway.

Azure DevOps

Connect your Azure DevOps organization to collect audit trails, CI/CD pipeline logs, and service hook events. LogPulse polls the Azure DevOps REST API and receives real-time webhooks for immediate event delivery.

Configuration

To connect Azure DevOps, you'll need a Personal Access Token (PAT) with the appropriate scopes.

| Field | Required | Description |
| --- | --- | --- |
| Organization URL | Yes | Your Azure DevOps organization URL (e.g., https://dev.azure.com/myorg) |
| Personal Access Token | Yes | A Personal Access Token with the required scopes (see below) |
| Project Name | No | Limit data collection to a specific project. If omitted, all projects are included. |
Tip: Create a PAT with the minimum required scopes. Use a service account rather than a personal account for production integrations.

The following PAT scopes are required for full functionality:

Required PAT Scopes
• Audit Log: Read
• Build: Read
• Code: Read
• Project and Team: Read
• Service Connections: Read
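Azure DevOps REST calls authenticate the PAT via HTTP Basic auth with an empty username and the token as the password. A minimal sketch of building the request pieces a poller would need (the helper names are illustrative, not part of LogPulse; the auth scheme and audit-log endpoint follow Azure DevOps conventions):

```python
import base64


def azure_devops_headers(pat: str) -> dict:
    """Build auth headers for the Azure DevOps REST API.

    Azure DevOps expects HTTP Basic auth with an empty username
    and the PAT as the password.
    """
    token = base64.b64encode(f":{pat}".encode("ascii")).decode("ascii")
    return {
        "Authorization": f"Basic {token}",
        "Accept": "application/json",
    }


def audit_log_url(org: str) -> str:
    """URL for the organization-level audit log endpoint."""
    return (f"https://auditservice.dev.azure.com/{org}"
            f"/_apis/audit/auditlog?api-version=7.1")
```

Pass the resulting headers to any HTTP client; the same headers work for the build and pipeline endpoints under `https://dev.azure.com/{org}`.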

Data Sources

Azure DevOps provides three data sources, each capturing different types of events:

| Data Source | Method | Events |
| --- | --- | --- |
| Audit Logs | Poll | Permission changes, PAT creation, git policy changes, group membership updates |
| Service Hooks | Webhook | Build completions, PR activity, code pushes, work item updates |
| Pipeline Logs | Poll | Pipeline run results, stage/job/task details, duration, status |

GitHub Actions

Integrate GitHub Actions to monitor CI/CD workflow executions, job results, and repository events. LogPulse normalizes GitHub workflow data to the CICDPipelineEvent model for consistent querying.

Configuration

You need a GitHub Personal Access Token (classic or fine-grained) to connect your repositories.

| Field | Required | Description |
| --- | --- | --- |
| Personal Access Token | Yes | A GitHub Personal Access Token with read access to actions and contents |
| Owner | Yes | GitHub organization or username that owns the repositories |
| Repository | No | Specific repository to monitor. If omitted, all repositories under the owner are included. |
Tip: Fine-grained tokens are recommended — they allow you to restrict access to specific repositories.

The token needs the following read-only permissions:

Required Token Permissions
• actions: read
• contents: read
• metadata: read

Data Sources

GitHub Actions provides three data sources for comprehensive CI/CD visibility:

| Data Source | Method | Events |
| --- | --- | --- |
| Workflow Runs | Poll | Workflow execution start, completion, status, duration, trigger info |
| Workflow Jobs | Poll | Individual job results, step details, runner information |
| Repository Events | Webhook | Push events, PR creation/merge, releases, branch protection changes |
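The Workflow Runs source corresponds to GitHub's `GET /repos/{owner}/{repo}/actions/runs` REST endpoint. A hedged sketch of how a poller might assemble that request (the endpoint, headers, and `status` filter follow GitHub's REST API; the helper itself is illustrative, not LogPulse code):

```python
def workflow_runs_request(owner: str, repo: str, token: str,
                          status: str = "") -> tuple[str, dict, dict]:
    """Build the URL, headers, and query params for listing workflow runs."""
    url = f"https://api.github.com/repos/{owner}/{repo}/actions/runs"
    headers = {
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
        "X-GitHub-Api-Version": "2022-11-28",
    }
    params = {"per_page": 100}
    if status:  # e.g. "failure" to fetch only failed runs
        params["status"] = status
    return url, headers, params
```

Responses are paginated; a real poller would follow the `Link` header until no `next` page remains.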

Railway

Connect Railway to monitor deployments, build processes, and runtime logs from your Railway projects. LogPulse captures the full deployment lifecycle from build start to runtime.

Configuration

Generate an API token in your Railway account settings to connect your projects.

| Field | Required | Description |
| --- | --- | --- |
| API Token | Yes | Railway API token from your account settings |
| Project ID | Yes | The Railway project ID to monitor |
| Environment ID | No | Specific environment to monitor (e.g., production). If omitted, all environments are included. |
| Service ID | No | Specific service within the project. If omitted, all services are included. |
Note: You can find your Project ID in the Railway dashboard under Project Settings → General.

Data Sources

Railway provides four data sources covering the complete deployment lifecycle:

| Data Source | Method | Events |
| --- | --- | --- |
| Deployments | Poll | Deployment creation, status changes, rollbacks, environment details |
| Build Logs | Poll | Build start, build logs, build success/failure, duration |
| Runtime Logs | Poll | Application stdout/stderr, crash logs, restart events |
| Deployment Events | Webhook | Real-time deployment lifecycle events via webhook |
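Railway's public API is GraphQL, so polling the Deployments source means posting a query with the API token as a Bearer credential. A sketch under stated assumptions — the endpoint is Railway's public GraphQL URL, but the exact query fields below are illustrative and should be checked against Railway's current schema:

```python
import json

RAILWAY_ENDPOINT = "https://backboard.railway.app/graphql/v2"

# Hypothetical query shape; verify field names against Railway's schema.
DEPLOYMENTS_QUERY = """
query ($projectId: String!) {
  deployments(input: {projectId: $projectId}, first: 20) {
    edges { node { id status createdAt } }
  }
}
"""


def deployments_request(project_id: str, api_token: str) -> tuple[str, dict, str]:
    """Build the endpoint, headers, and JSON body for a deployments poll."""
    headers = {
        "Authorization": f"Bearer {api_token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"query": DEPLOYMENTS_QUERY,
                       "variables": {"projectId": project_id}})
    return RAILWAY_ENDPOINT, headers, body
```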

Webhooks

Each integration provides a unique webhook URL for real-time event delivery. Configure the webhook URL in your provider's settings to receive events immediately instead of waiting for the next poll cycle.

Webhook URL Format
POST https://api.logpulse.io/api/v1/webhooks/cloud/:integrationId

After creating an integration, copy the webhook URL from the integration details and paste it into your provider's webhook configuration. LogPulse automatically validates incoming webhook payloads.

Note: Webhook endpoints validate the payload signature where supported (e.g., GitHub webhook secrets). Always configure webhook secrets in your provider for production integrations.
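For GitHub, signature validation means recomputing an HMAC-SHA256 of the raw request body with the shared webhook secret and comparing it to the X-Hub-Signature-256 header. A minimal sketch of that check (the function name is illustrative; the header format and digest scheme are GitHub's):

```python
import hashlib
import hmac


def verify_github_signature(secret: str, body: bytes,
                            signature_header: str) -> bool:
    """Return True if the X-Hub-Signature-256 header matches the body.

    GitHub sends the header as "sha256=<hexdigest>", where the digest
    is HMAC-SHA256 keyed by the webhook secret over the raw body.
    """
    expected = "sha256=" + hmac.new(secret.encode(), body,
                                    hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information via timing differences
    return hmac.compare_digest(expected, signature_header)
```

Always verify against the raw bytes as received; re-serializing parsed JSON can change whitespace and break the digest.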

Polling & Sync

For data sources that don't support webhooks, LogPulse uses scheduled polling to fetch new events. The polling worker runs in the background and automatically handles pagination, deduplication, and retries.

| Setting | Description |
| --- | --- |
| Poll Interval | How often LogPulse checks for new events (configurable per integration, default: 5 minutes) |
| Deduplication | Events are deduplicated by their unique provider ID to prevent duplicate entries |
| Retry Logic | Failed poll attempts are retried with exponential backoff (max 3 attempts) |
Tip: Reduce the poll interval for critical pipelines that need near-real-time visibility. For less critical data, a longer interval reduces API usage against your provider's rate limits.
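The dedup-and-retry behavior above can be sketched as follows. This is a minimal illustration, not LogPulse's worker: only the max-attempt count, exponential backoff, and dedup-by-provider-ID come from the table.

```python
import time
from typing import Callable, Iterable


def poll_with_retry(fetch: Callable[[], list], max_attempts: int = 3,
                    base_delay: float = 1.0) -> list:
    """Call fetch(), retrying with exponential backoff (1s, 2s, 4s, ...)."""
    for attempt in range(max_attempts):
        try:
            return fetch()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the error
            time.sleep(base_delay * 2 ** attempt)
    return []


def dedupe(events: Iterable[dict], seen: set) -> list:
    """Drop events whose provider-assigned ID was already ingested."""
    fresh = []
    for ev in events:
        if ev["id"] not in seen:
            seen.add(ev["id"])
            fresh.append(ev)
    return fresh
```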

Data Models

All cloud integration events are normalized to standardized data models before being stored. This ensures consistent field names and queryability across providers.

| Model | Used By | Key Fields |
| --- | --- | --- |
| CICDPipelineEvent | GitHub Actions, Azure DevOps | pipeline_name, status, trigger, duration_ms, branch, commit_sha |
| DeploymentEvent | Railway | service_name, environment, status, deploy_type, image, duration_ms |
| AuditEvent | Azure DevOps | action, actor, target, ip_address, timestamp, scope |

Use the sourcetype field in LPQL to filter events by provider. Each provider sets a unique sourcetype (e.g., "github_actions", "azure_devops", "railway").

LPQL Example — Query CI/CD events
sourcetype="github_actions" status="failure" | stats count by workflow_name
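To illustrate what normalization looks like, here is a hedged sketch mapping a simplified GitHub workflow-run payload onto the CICDPipelineEvent fields from the table above. The field mapping and the flattened input shape are assumptions for illustration, not LogPulse's actual normalizer.

```python
from dataclasses import dataclass


@dataclass
class CICDPipelineEvent:
    """Normalized CI/CD event, per the data model table."""
    pipeline_name: str
    status: str
    trigger: str
    duration_ms: int
    branch: str
    commit_sha: str
    sourcetype: str = "github_actions"


def from_workflow_run(run: dict) -> CICDPipelineEvent:
    """Map a simplified workflow-run payload to the normalized model.

    Assumes duration_ms was precomputed upstream; GitHub's raw payload
    carries timestamps, not a duration field.
    """
    return CICDPipelineEvent(
        pipeline_name=run["name"],
        status=run["conclusion"],
        trigger=run["event"],
        duration_ms=run["duration_ms"],
        branch=run["head_branch"],
        commit_sha=run["head_sha"],
    )
```

Because every provider lands on the same field names, the LPQL query above works unchanged if you swap the sourcetype.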

Connection Validation

When you create or edit an integration, LogPulse runs a validation test to verify the connection before saving. This ensures your credentials are correct and the required permissions are in place.

The validation test checks:

  • URL format and accessibility
  • Authentication with provided credentials
  • Access to the configured data sources (e.g., audit log read access)
  • Specific project/repo access (if configured)
Warning: If validation fails, double-check your credentials and ensure the token has the required scopes. Expired or revoked tokens are the most common cause of validation failures.
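A sketch of the first check in that sequence, purely illustrative since the real validator runs server-side: URL-format validation can be done locally before any credentials are sent.

```python
from urllib.parse import urlparse


def validate_org_url(url: str) -> tuple[bool, str]:
    """First validation step: the URL must parse and use HTTPS."""
    parsed = urlparse(url)
    if parsed.scheme != "https" or not parsed.netloc:
        return False, f"invalid organization URL: {url!r}"
    return True, "ok"
```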

Managing Integrations

After creating an integration, you can manage it from the Integrations page in the LogPulse dashboard. Each integration shows its status, last sync time, and event count.

| Action | Description |
| --- | --- |
| Pause / Resume | Temporarily stop data collection without deleting the integration configuration |
| Edit Configuration | Update credentials, project scope, or poll interval. Re-validation runs automatically. |
| Delete Integration | Remove the integration and stop all data collection. Previously collected data remains searchable. |
| View Sync Logs | Inspect recent sync history, errors, and event counts per poll cycle |

Coming Soon

We're actively working on adding more cloud provider integrations. The following platforms are planned for upcoming releases:

Argo CD
Flux CD
GitLab CI
Tekton
Jenkins
Amazon EKS
Google GKE
Azure AKS
Falco
Trivy
HashiCorp Vault
Prometheus
Note: Have a provider you'd like to see supported? Contact us at [email protected] — we prioritize based on user demand.