Integrations

Cloud Setup

LogPulse integrates directly with your CI/CD and deployment platforms to collect audit logs, pipeline events, and deployment data. Set up cloud integrations to get full visibility into your DevOps workflows without installing any agents.

Overview

Cloud integrations connect LogPulse to your existing platforms via API polling and webhooks. Events are automatically normalized to standard data models, making them searchable with LPQL alongside your application logs.

• 4 active providers
• 2 collection methods
• 5+ data sources
Note: Cloud integrations collect platform-level events (builds, deploys, audits). For application logs from your own services, use the Vector Agent or HTTP API instead.

Supported Providers

LogPulse currently supports four cloud providers for direct integration. Every provider supports API polling, and most also deliver events in real time via webhooks or OTLP push.

Azure DevOps

Audit logs, service hooks, and pipeline logs from Azure DevOps organizations.

GitHub Actions

Workflow runs, job results, and repository events from GitHub Actions.

Railway

Deployments, build logs, runtime logs, and deployment events from Railway.

Cloudflare

Logs, traces, firewall events, audit logs, Pages deployments, and Zero Trust security logs from Cloudflare.

Azure DevOps

Connect your Azure DevOps organization to collect audit trails, CI/CD pipeline logs, and service hook events. LogPulse polls the Azure DevOps REST API and receives real-time webhooks for immediate event delivery.

Configuration

To connect Azure DevOps, you'll need a Personal Access Token (PAT) with the appropriate scopes.

| Field | Required | Description |
| --- | --- | --- |
| Organization URL | Yes | Your Azure DevOps organization URL (e.g., https://dev.azure.com/myorg) |
| Personal Access Token | Yes | A Personal Access Token with required scopes (see below) |
| Project Name | No | Limit data collection to a specific project. If omitted, all projects are included. |
Tip: Create a PAT with the minimum required scopes. Use a service account rather than a personal account for production integrations.

The following PAT scopes are required for full functionality:

Required PAT Scopes
• Audit Log: Read
• Build: Read
• Code: Read
• Project and Team: Read
• Service Connections: Read
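
As a quick local sanity check before saving the integration, you can build the authorization header yourself: the Azure DevOps REST API accepts a PAT as the password of HTTP Basic auth with an empty username. A minimal sketch (the dummy token below is a placeholder, never hard-code a real PAT):

```python
import base64

def azure_devops_auth_header(pat: str) -> dict:
    """Azure DevOps accepts a PAT via HTTP Basic auth with an empty
    username, i.e. base64(":" + pat)."""
    token = base64.b64encode(f":{pat}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

# Placeholder token for illustration only:
headers = azure_devops_auth_header("dummy-pat")
```

Send this header with any Azure DevOps REST call to confirm the token authenticates before wiring it into the integration.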

Data Sources

Azure DevOps provides three data sources, each capturing different types of events:

| Data Source | Method | Events |
| --- | --- | --- |
| Audit Logs | Poll | Permission changes, PAT creation, git policy changes, group membership updates |
| Service Hooks | Webhook | Build completions, PR activity, code pushes, work item updates |
| Pipeline Logs | Poll | Pipeline run results, stage/job/task details, duration, status |

GitHub Actions

Integrate GitHub Actions to monitor CI/CD workflow executions, job results, and repository events. LogPulse normalizes GitHub workflow data to the CICDPipelineEvent model for consistent querying.

Configuration

You need a GitHub Personal Access Token (classic or fine-grained) to connect your repositories.

| Field | Required | Description |
| --- | --- | --- |
| Personal Access Token | Yes | A GitHub Personal Access Token with read access to actions and contents |
| Owner | Yes | GitHub organization or username that owns the repositories |
| Repository | No | Specific repository to monitor. If omitted, all repositories under the owner are included. |
Tip: Fine-grained tokens are recommended — they allow you to restrict access to specific repositories.

The token needs read-only access to the following permissions:

Required Token Permissions
• actions: read
• contents: read
• metadata: read

Data Sources

GitHub Actions provides three data sources for comprehensive CI/CD visibility:

| Data Source | Method | Events |
| --- | --- | --- |
| Workflow Runs | Poll | Workflow execution start, completion, status, duration, trigger info |
| Workflow Jobs | Poll | Individual job results, step details, runner information |
| Repository Events | Webhook | Push events, PR creation/merge, releases, branch protection changes |
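
To see how a raw webhook payload maps onto the normalized model, here is an illustrative sketch that picks CICDPipelineEvent-style fields out of a GitHub `workflow_run` payload. LogPulse performs this normalization for you; the exact mapping below is an assumption for illustration, though the payload fields shown are GitHub's documented `workflow_run` fields:

```python
def normalize_workflow_run(payload: dict) -> dict:
    """Map a GitHub `workflow_run` webhook payload onto the
    CICDPipelineEvent-style fields used in these docs (illustrative only)."""
    run = payload["workflow_run"]
    return {
        "pipeline_name": run["name"],
        # `conclusion` is null while the run is still in progress
        "status": run["conclusion"] or run["status"],
        "trigger": run["event"],          # e.g. "push", "pull_request"
        "branch": run["head_branch"],
        "commit_sha": run["head_sha"],
    }

sample = {
    "action": "completed",
    "workflow_run": {
        "name": "CI",
        "status": "completed",
        "conclusion": "failure",
        "event": "push",
        "head_branch": "main",
        "head_sha": "abc123",
    },
}
event = normalize_workflow_run(sample)
```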

Railway

Connect Railway to monitor deployments, build processes, and runtime logs from your Railway projects. LogPulse captures the full deployment lifecycle from build start to runtime.

Configuration

Generate an API token in your Railway account settings to connect your projects.

| Field | Required | Description |
| --- | --- | --- |
| API Token | Yes | Railway API token from your account settings |
| Project ID | Yes | The Railway project ID to monitor |
| Environment ID | No | Specific environment to monitor (e.g., production). If omitted, all environments are included. |
| Service ID | No | Specific service within the project. If omitted, all services are included. |
Note: You can find your Project ID in the Railway dashboard under Project Settings → General.

Data Sources

Railway provides four data sources covering the complete deployment lifecycle:

| Data Source | Method | Events |
| --- | --- | --- |
| Deployments | Poll | Deployment creation, status changes, rollbacks, environment details |
| Build Logs | Poll | Build start, build logs, build success/failure, duration |
| Runtime Logs | Poll | Application stdout/stderr, crash logs, restart events |
| Deployment Events | Webhook | Real-time deployment lifecycle events via webhook |

Cloudflare

Connect Cloudflare to collect logs, traces, firewall events, audit logs, Pages deployments, and more. Use OTLP push for real-time Workers & Pages logs and traces, or API polling for infrastructure data.

• Edge HTTP request logs with RayID tracing
• WAF/Firewall security event monitoring
• Account audit trail for compliance (NIS2)
• Workers trace events (poll + OTLP push)
• Pages deployment tracking
• Zero Trust Access & Gateway logs
• DNS query analytics
• Origin healthcheck monitoring
Note: OTLP push for Workers & Pages requires no Cloudflare credentials — just configure Cloudflare Observability to point at your LogPulse OTLP endpoint. An API key is auto-generated during setup.

Data Sources

Cloudflare provides ten data sources across security, observability, and deployment tracking. Sources use either API polling or OTLP push depending on the data type:

| Data Source | Plan | Method | Description |
| --- | --- | --- | --- |
| HTTP Request Logs | Enterprise | Poll | Edge HTTP request logs from Cloudflare with RayID tracing. |
| DNS Query Logs | Enterprise | Poll | DNS query analytics from Cloudflare. |
| Firewall / WAF Events | Free | Poll | Web Application Firewall events including blocked requests, challenges, bot detections, and rate limiting triggers. |
| Account Audit Logs | Free | Poll | Account-level audit trail from Cloudflare. |
| Zero Trust Access Logs | Zero Trust | Poll | Zero Trust Access authentication events. |
| Zero Trust Gateway Logs | Zero Trust | Poll | Zero Trust Gateway activity logs. |
| Workers & Pages Logs | Workers Paid | Push (OTLP) | Application logs from Cloudflare Workers and Pages via OTLP push. |
| Workers & Pages Traces | Workers Paid | Push (OTLP) | Cloudflare Workers and Pages execution traces via OTLP push. |
| Pages Deployments | Free | Poll | Cloudflare Pages deployment events. |
| Healthcheck Events | Pro | Poll | Origin health monitoring from Cloudflare. |

Data Models

Cloudflare events are normalized to five data models depending on the source type:

| Model | Sourcetype | Sources | Description |
| --- | --- | --- | --- |
| Application Log | cloudflare:workers:log | Workers & Pages Logs | Workers and Pages application logs streamed in real-time via OTLP push. Console output, exceptions, and custom events. |
| Security Event | security_event | Firewall, Audit, Access, Gateway | Firewall/WAF events, audit logs, Zero Trust Access and Gateway logs normalized with standardized security fields. |
| Trace Event | observability_trace | Workers & Pages Traces | Workers and Pages execution traces and OTLP spans with service name, duration, status, and error details. |
| CI/CD Pipeline | cicd_pipeline | Pages Deployments | Pages deployment events with build status, stages, git trigger info, and environment context. |
| Observability Alert | observability_alert | Healthchecks | Healthcheck status changes with origin address, failure reasons, and check regions. |
LPQL Example — Query Workers logs
sourcetype="cloudflare:workers:log" level="error" | stats count by service_name
LPQL Example — Query WAF blocks
sourcetype="security_event" waf_action="block" | stats count by path, rule_id

Requirements

Requirements differ depending on whether you use OTLP push (Workers/Pages) or API polling (infrastructure data):

| Field | Required | Description |
| --- | --- | --- |
| OTLP Push | Workers/Pages only | No Cloudflare credentials needed — just configure Cloudflare Observability to push to your LogPulse OTLP endpoint. |
| API Token | API polling | Cloudflare API Token with appropriate permissions for the data sources you want to poll. |
| Account ID | API polling | Your Cloudflare Account ID (found in the dashboard sidebar). |
| Zone ID | Optional | Zone ID for zone-level data sources (HTTP logs, firewall, DNS, healthchecks). Optional if using account-level sources only. |
Tip: Use a custom API Token instead of a Global API Key. Custom tokens can be scoped to specific zones and permissions for better security.
Required API Token Permissions
• Zone: Analytics: Read
• Zone: Firewall Services: Read
• Zone: Logs: Read (Enterprise only)
• Account: Account Settings: Read (for audit logs)
• Account: Zero Trust: Read (for Access & Gateway logs)
• Account: Cloudflare Pages: Read (for Pages deployments)

Webhooks

Each integration provides a unique webhook URL for real-time event delivery. Configure the webhook URL in your provider's settings to receive events immediately instead of waiting for the next poll cycle.

Webhook URL Format
POST https://api.logpulse.io/api/v1/webhooks/cloud/:integrationId

After creating an integration, copy the webhook URL from the integration details and paste it into your provider's webhook configuration. LogPulse automatically validates incoming webhook payloads.

Note: Webhook endpoints validate the payload signature where supported (e.g., GitHub webhook secrets). Always configure webhook secrets in your provider for production integrations.
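
GitHub's scheme, for reference, signs the raw request body with HMAC-SHA256 keyed by the webhook secret and delivers the result in the X-Hub-Signature-256 header. A minimal verification sketch (illustrative — LogPulse performs the equivalent check server-side):

```python
import hashlib
import hmac

def verify_github_signature(secret: str, body: bytes, signature_header: str) -> bool:
    """Check GitHub's X-Hub-Signature-256 header: "sha256=" followed by
    the hex HMAC-SHA256 of the raw request body, keyed with the secret."""
    expected = "sha256=" + hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(expected, signature_header)

body = b'{"action":"completed"}'
sig = "sha256=" + hmac.new(b"my-secret", body, hashlib.sha256).hexdigest()
valid = verify_github_signature("my-secret", body, sig)
invalid = verify_github_signature("wrong-secret", body, sig)
```

Always verify against the raw body bytes, not a re-serialized copy — any whitespace or key-order change breaks the digest.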

Polling & Sync

For data sources that don't support webhooks, LogPulse uses scheduled polling to fetch new events. The polling worker runs in the background and automatically handles pagination, deduplication, and retries.

| Setting | Description |
| --- | --- |
| Poll Interval | How often LogPulse checks for new events (configurable per integration, default: 5 minutes) |
| Deduplication | Events are deduplicated by their unique provider ID to prevent duplicate entries |
| Retry Logic | Failed poll attempts are retried with exponential backoff (max 3 attempts) |
Tip: Reduce the poll interval for critical pipelines that need near-real-time visibility. For less critical data, a longer interval reduces API usage against your provider's rate limits.
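
The deduplication and retry behavior above can be sketched as follows; `fetch` and the event shape are hypothetical stand-ins for a provider API call:

```python
import time

def poll_once(fetch, seen_ids, max_attempts=3, base_delay=1.0):
    """Fetch new events with exponential-backoff retries, then drop any
    event whose provider ID has already been ingested."""
    for attempt in range(max_attempts):
        try:
            events = fetch()
            break
        except Exception:
            if attempt == max_attempts - 1:
                raise  # give up after the final attempt
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
    fresh = []
    for event in events:
        if event["id"] not in seen_ids:  # dedupe by unique provider ID
            seen_ids.add(event["id"])
            fresh.append(event)
    return fresh

# Simulate one transient failure followed by a successful poll.
calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] == 1:
        raise ConnectionError("transient provider error")
    return [{"id": "evt-1"}, {"id": "evt-2"}, {"id": "evt-1"}]

seen = set()
first = poll_once(flaky_fetch, seen, base_delay=0.01)
second = poll_once(flaky_fetch, seen, base_delay=0.01)  # everything already seen
```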

Data Models

All cloud integration events are normalized to standardized data models before being stored. This ensures consistent field names and queryability across providers.

| Model | Used By | Key Fields |
| --- | --- | --- |
| CICDPipelineEvent | GitHub Actions, Azure DevOps, Cloudflare Pages | pipeline_name, status, trigger, duration_ms, branch, commit_sha |
| DeploymentEvent | Railway | service_name, environment, status, deploy_type, image, duration_ms |
| AuditEvent | Azure DevOps | action, actor, target, ip_address, timestamp, scope |
| SecurityEvent | Cloudflare | waf_action, rule_id, client_ip, path, bot_score, threat_score |
| ApplicationLog | Cloudflare Workers | service_name, level, message, timestamp, exception, worker_name |
| TraceEvent | Cloudflare Workers | service_name, span_id, trace_id, duration_ms, status, error |
| ObservabilityAlert | Cloudflare | origin_address, status, failure_reason, check_region, latency_ms |

Use the sourcetype field in LPQL to filter events by provider. Each provider sets a unique sourcetype (e.g., "github_actions", "azure_devops", "railway", "cloudflare:workers:log").

LPQL Example — Query CI/CD events
sourcetype="github_actions" status="failure" | stats count by workflow_name

Connection Validation

When you create or edit an integration, LogPulse runs a validation test to verify the connection before saving. This ensures your credentials are correct and the required permissions are in place.

The validation test checks:

  • URL format and accessibility
  • Authentication with provided credentials
  • Access to the configured data sources (e.g., audit log read access)
  • Specific project/repo access (if configured)
Warning: If validation fails, double-check your credentials and ensure the token has the required scopes. Expired or revoked tokens are the most common cause of validation failures.
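
The first check — URL format — is easy to reproduce locally before you open the dashboard. A minimal sketch, using an Azure DevOps organization URL as the example (illustrative only; LogPulse additionally tests authentication and scope access):

```python
from urllib.parse import urlparse

def validate_org_url(url: str) -> bool:
    """First validation step: confirm the URL is well-formed HTTPS
    before attempting any authenticated calls."""
    parsed = urlparse(url)
    return parsed.scheme == "https" and bool(parsed.netloc)

ok = validate_org_url("https://dev.azure.com/myorg")
bad = validate_org_url("dev.azure.com/myorg")  # missing scheme
```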

Managing Integrations

After creating an integration, you can manage it from the Integrations page in the LogPulse dashboard. Each integration shows its status, last sync time, and event count.

| Action | Description |
| --- | --- |
| Pause / Resume | Temporarily stop data collection without deleting the integration configuration |
| Edit Configuration | Update credentials, project scope, or poll interval. Re-validation runs automatically. |
| Delete Integration | Remove the integration and stop all data collection. Previously collected data remains searchable. |
| View Sync Logs | Inspect recent sync history, errors, and event counts per poll cycle |

Coming Soon

We're actively working on adding more cloud provider integrations. The following platforms are planned for upcoming releases:

Argo CD
Flux CD
GitLab CI
Tekton
Jenkins
Amazon EKS
Google GKE
Azure AKS
Falco
Trivy
HashiCorp Vault
Prometheus
Note: Have a provider you'd like to see supported? Contact us at [email protected] — we prioritize based on user demand.
