Integrations

HTTP API

The LogPulse HTTP API lets you ingest and query log data programmatically. Send logs from any language or platform using simple REST calls with JSON payloads. The API supports single and batch ingestion, LPQL queries, and provides real-time quota tracking.

Overview

The LogPulse API is a RESTful JSON API. All requests and responses use JSON encoding. Authentication is done via Bearer token in the Authorization header.

API style           REST
Encoding            JSON
Max request body    10 MB

Authentication

All API requests require a valid API key passed as a Bearer token in the Authorization header. Create API keys in the LogPulse dashboard under Integrations → HTTP API.

Authorization header
Authorization: Bearer YOUR_API_KEY

Each API key is scoped to your organization and can optionally be linked to an ETL pipeline for automatic processing. You can create multiple API keys for different applications or environments.

Warning: Keep your API keys secret. Never expose them in client-side code, public repositories, or logs. If a key is compromised, revoke it immediately from the dashboard and create a new one.

Base URL

https://api.logpulse.io

All API endpoints are relative to this base URL. The API is served over HTTPS only.

Ingest Logs

Send log events to LogPulse for indexing and analysis. The ingest endpoint accepts both single log objects and arrays of logs for batch ingestion.

Log Schema

Each log event follows a simple schema. All fields are optional with sensible defaults:

event (string | object, default: "")
    The log event content: a text string or structured object.
level (string, default: "info")
    Log level: debug, info, warn, error, fatal.
timestamp (string | number, default: now)
    Event timestamp as an ISO 8601 string or Unix timestamp. Auto-generated if omitted.
index (string, default: "main")
    Target index for the event.
source (string, default: "")
    Source identifier (e.g., service name, hostname).
sourcetype (string, default: "")
    Source type for categorization (e.g., 'application', 'nginx_access').
host (string, default: "")
    Originating host name.
attributes (Record<string, string>, default: {})
    Arbitrary key-value pairs for structured data (searchable via LPQL).

Single Log

POST /api/v1/logs

Send a single log event as a JSON object in the request body.

Request body
{
  "event": "User login successful",
  "level": "info",
  "timestamp": "2026-03-21T10:30:00.000Z",
  "index": "production",
  "source": "auth-service",
  "sourcetype": "application",
  "host": "web-01",
  "attributes": {
    "user_id": "usr_abc123",
    "ip_address": "192.168.1.100",
    "method": "POST",
    "path": "/api/auth/login"
  }
}
Response (200)
{
  "data": {
    "accepted": 1,
    "rejected": 0,
    "timestamp": "2026-03-21T10:30:00.000Z",
    "quotaStatus": {
      "plan": "starter",
      "limitMB": 1024,
      "usedMB": 42.5,
      "percent": 4.15,
      "blocked": false
    }
  }
}

Batch Ingest

POST /api/v1/logs

Send multiple log events at once as a JSON array. Batch ingestion is more efficient for high-volume logging — fewer HTTP requests and lower overhead.

Batch request body
[
  {
    "event": "GET /api/users 200 12ms",
    "level": "info",
    "source": "nginx",
    "sourcetype": "access_log",
    "attributes": { "status": "200", "duration_ms": "12" }
  },
  {
    "event": "Database connection timeout",
    "level": "error",
    "source": "api-server",
    "sourcetype": "application",
    "attributes": { "db": "postgres", "timeout_ms": "5000" }
  }
]
Note: Batch ingestion supports partial success. If some log events in a batch fail validation, the valid events are still ingested. The response reports the counts of accepted and rejected events.
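
Because request bodies are capped at 10 MB, very large batches should be split client-side before posting. A minimal sketch of one way to do this (the helper name and the 500-event cap are our own choices, not API requirements):

```python
import json

def chunk_logs(logs, max_events=500, max_bytes=5_000_000):
    """Split a list of log dicts into batches that stay under both an
    event-count cap and an approximate serialized-size cap (~5 MB,
    comfortably below the 10 MB request limit)."""
    batch, batch_bytes = [], 0
    for log in logs:
        size = len(json.dumps(log).encode("utf-8"))
        if batch and (len(batch) >= max_events or batch_bytes + size > max_bytes):
            yield batch
            batch, batch_bytes = [], 0
        batch.append(log)
        batch_bytes += size
    if batch:
        yield batch
```

Each yielded batch can then be POSTed to /api/v1/logs as a JSON array, exactly as in the batch example above.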

Vector Ingest

POST /api/v1/ingest/vector

A dedicated endpoint optimized for the Vector log agent. It accepts the same JSON payload format but includes additional processing for Vector-specific metadata fields.

Tip: Use this endpoint with Vector's HTTP sink for optimal compatibility. See the Vector Agent documentation for full configuration details.
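
A sketch of a Vector HTTP sink pointed at this endpoint (the source name app_logs is a placeholder; check the Vector http sink reference for the options available in your Vector version):

```toml
[sinks.logpulse]
type   = "http"
inputs = ["app_logs"]  # placeholder: replace with your source/transform name
uri    = "https://api.logpulse.io/api/v1/ingest/vector"
encoding.codec = "json"
auth.strategy  = "bearer"
auth.token     = "YOUR_API_KEY"
```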

Query Logs

GET /api/v1/logs

Search and retrieve log events using LPQL queries. Results are returned in reverse chronological order by default.

query (string)
    LPQL query string (e.g., level="error" source="api")
level (string)
    Filter results by log level
source (string)
    Filter results by source
sourcetype (string)
    Filter results by sourcetype
from (string)
    Start of time range (ISO 8601)
to (string)
    End of time range (ISO 8601)
limit (number)
    Maximum number of results to return (default: 100, max: 10000)
offset (number)
    Number of results to skip for pagination
Example request
GET /api/v1/logs?query=timeout&level=error&source=api-server&from=2026-03-20T00:00:00Z&limit=100
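
The same request can be issued from code by assembling the filters as URL parameters. A stdlib-only sketch (the helper names are ours, not part of the API):

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "https://api.logpulse.io"

def build_query_params(query, level=None, source=None, sourcetype=None,
                       from_=None, to=None, limit=100, offset=0):
    """Assemble query-string parameters, omitting unset filters.
    'from' is a Python keyword, so the argument is named from_."""
    params = {"query": query, "limit": limit, "offset": offset}
    optional = {"level": level, "source": source, "sourcetype": sourcetype,
                "from": from_, "to": to}
    params.update({k: v for k, v in optional.items() if v is not None})
    return params

def query_logs(api_key, **filters):
    """GET /api/v1/logs with the given filters; returns the parsed JSON body."""
    url = f"{BASE_URL}/api/v1/logs?" + urllib.parse.urlencode(build_query_params(**filters))
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {api_key}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

To page through large result sets, increment offset by limit on each call until fewer than limit results come back.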

Recent Logs

GET /api/v1/logs/recent

Quickly fetch the most recent log events without specifying a query. Useful for tail-like functionality and dashboard widgets.

Rate Limits

The API enforces rate limits to ensure fair usage and platform stability. Limits are applied per API key.

Endpoint                 Limit     Window
/api/v1/logs (POST)      10,000    Per minute
/api/v1/ingest/vector    10,000    Per minute
/api/v1/logs (GET)       100       Per minute

Rate limit information is included in response headers:

Rate limit response headers
X-RateLimit-Limit: 10000
X-RateLimit-Remaining: 9542
X-RateLimit-Reset: 1711018860
Warning: When a rate limit is exceeded, the API returns a 429 status code. Implement exponential backoff in your client to handle rate limiting gracefully.
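
One way to implement that backoff, sketched with our own helper names (the jitter and caps are conventional choices, not API requirements):

```python
import random
import time

def backoff_delay(attempt, base=1.0, cap=60.0):
    """Exponential backoff with full jitter: a random delay drawn
    from [0, min(cap, base * 2**attempt)] seconds."""
    return random.uniform(0, min(cap, base * (2 ** attempt)))

def send_with_retry(do_request, max_attempts=5, base=1.0):
    """Call do_request() until it returns a response whose .status_code
    is not 429, sleeping with jittered exponential backoff in between."""
    for attempt in range(max_attempts):
        resp = do_request()
        if resp.status_code != 429:
            return resp
        time.sleep(backoff_delay(attempt, base=base))
    return resp  # still rate-limited after max_attempts
```

For tighter scheduling, the X-RateLimit-Reset header (a Unix timestamp) can be used instead of the computed delay when it is present on the 429 response.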

Quotas

Each organization has a daily data ingestion quota based on their plan tier. The quota is measured in megabytes of raw log data per day and resets at midnight UTC.

Threshold    Behavior
< 80%        Normal operation: logs are ingested without restrictions
80–100%      Warning threshold: a quota warning is included in API responses
100%         Quota exceeded: new log ingestion is blocked until the next day

Every ingest response includes a quotaStatus object with usedMB, limitMB, and percent fields so you can monitor your consumption.
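
The thresholds above can be checked client-side from that quotaStatus object. A small sketch (the function name and return labels are ours):

```python
def quota_state(quota_status):
    """Classify a quotaStatus dict against the documented thresholds:
    'blocked' at 100% (or when the API flags it), 'warning' at 80%+,
    otherwise 'ok'."""
    percent = quota_status.get("percent", 0)
    if quota_status.get("blocked") or percent >= 100:
        return "blocked"
    if percent >= 80:
        return "warning"
    return "ok"
```

A client might log a warning or alert an operator whenever the state leaves "ok", rather than waiting for ingestion to be blocked.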

Error Handling

The API uses standard HTTP status codes and returns structured JSON error responses with details about what went wrong.

400 Bad Request: invalid JSON or schema
    Check the request body against the log schema.
401 Unauthorized: invalid or missing API key
    Verify the Authorization header contains a valid Bearer token.
413 Payload Too Large: body exceeds 10 MB
    Split the batch into smaller chunks (recommended: < 5 MB per request).
429 Too Many Requests: rate limit exceeded
    Implement exponential backoff. Check the X-RateLimit-Reset header for retry timing.
500 Internal Server Error
    Retry with backoff. If persistent, contact [email protected].
Error response format
{
  "error": "Validation failed",
  "code": "VALIDATION_ERROR",
  "message": "Invalid log entry format"
}
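
A client can fold the table above into a simple classification of retryable versus non-retryable errors. A sketch, using our own labels:

```python
# Statuses where retrying the same request can succeed later.
RETRYABLE = {429, 500}

def classify_error(status_code):
    """Map a LogPulse error status to a suggested client reaction."""
    if status_code in RETRYABLE:
        return "retry"          # back off, then resend unchanged
    if status_code in (400, 413):
        return "fix-request"    # payload problem: correct it before resending
    if status_code == 401:
        return "fix-auth"       # bad or missing API key
    return "unknown"
```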

Code Examples

Here are complete examples for sending logs to LogPulse from popular languages and tools:

cURL

Send a single log
curl -X POST https://api.logpulse.io/api/v1/logs \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "event": "Payment processed successfully",
    "level": "info",
    "source": "payment-service",
    "attributes": {
      "amount": "49.99",
      "currency": "EUR",
      "order_id": "ord_xyz789"
    }
  }'

Python

python — requests
import requests

API_KEY = "YOUR_API_KEY"
BASE_URL = "https://api.logpulse.io"

# Send a batch of logs
logs = [
    {
        "event": "User signed up",
        "level": "info",
        "source": "auth-service",
        "attributes": {"user_id": "usr_001", "plan": "growth"}
    },
    {
        "event": "Welcome email sent",
        "level": "info",
        "source": "email-service",
        "attributes": {"user_id": "usr_001", "template": "welcome"}
    }
]

response = requests.post(
    f"{BASE_URL}/api/v1/logs",
    json=logs,
    headers={"Authorization": f"Bearer {API_KEY}"}
)

print(response.json())
# {"data": {"accepted": 2, "rejected": 0, "timestamp": "...", "quotaStatus": {...}}}

Node.js

node.js — fetch
const API_KEY = process.env.LOGPULSE_API_KEY;

async function sendLog(event, level = 'info', attributes = {}) {
  const response = await fetch('https://api.logpulse.io/api/v1/logs', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      event,
      level,
      source: 'my-node-app',
      sourcetype: 'application',
      attributes,
    }),
  });

  if (!response.ok) {
    throw new Error(`LogPulse API error: ${response.status}`);
  }

  return response.json();
}

// Usage
await sendLog('Order created', 'info', { order_id: 'ord_123' });

Go

go
package main

import (
    "bytes"
    "encoding/json"
    "fmt"
    "net/http"
    "os"
)

type LogEntry struct {
    Event      string            `json:"event"`
    Level      string            `json:"level,omitempty"`
    Source     string            `json:"source,omitempty"`
    Sourcetype string            `json:"sourcetype,omitempty"`
    Attributes map[string]string `json:"attributes,omitempty"`
}

func SendLog(entry LogEntry) error {
    body, err := json.Marshal(entry)
    if err != nil {
        return err
    }

    req, err := http.NewRequest("POST",
        "https://api.logpulse.io/api/v1/logs",
        bytes.NewBuffer(body))
    if err != nil {
        return err
    }

    req.Header.Set("Content-Type", "application/json")
    req.Header.Set("Authorization",
        "Bearer "+os.Getenv("LOGPULSE_API_KEY"))

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        return err
    }
    defer resp.Body.Close()

    if resp.StatusCode != http.StatusOK {
        return fmt.Errorf("unexpected status: %d", resp.StatusCode)
    }
    return nil
}