Open-source • Apache 2.0 License

Reliable webhooks and background jobs at any scale

Open-source queue infrastructure for webhooks, background jobs, and workflows. Built-in retries, idempotency, and dead-letter queues—so failed work is visible, recoverable, and re-playable.

Star on GitHub
webhook-handler.ts
// Stripe webhook → Spooled → Your worker
const response = await fetch('https://api.spooled.cloud/api/v1/jobs', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    queue_name: 'stripe-events',
    payload: stripeEvent,
    idempotency_key: stripeEvent.id,
  })
});

// ✓ Queued instantly
// ✓ Retries on failure (exponential backoff)
// ✓ Deduplicated by idempotency key
// ✓ Live status in dashboard
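
Behind the idempotency_key above: submitting the same key twice must never create a second job. A minimal in-memory sketch of that dedup behavior (illustrative only; Spooled enforces this server-side, backed by PostgreSQL):

```javascript
// Sketch of idempotency-key deduplication. Illustrative only --
// Spooled does this server-side.
class IdempotentQueue {
  constructor() {
    this.byKey = new Map();   // idempotency_key -> job
    this.jobs = [];
  }

  enqueue({ queue_name, payload, idempotency_key }) {
    // A repeated key returns the existing job instead of creating a duplicate
    if (idempotency_key && this.byKey.has(idempotency_key)) {
      return { job: this.byKey.get(idempotency_key), deduplicated: true };
    }
    const job = { id: `job_${this.jobs.length + 1}`, queue_name, payload };
    this.jobs.push(job);
    if (idempotency_key) this.byKey.set(idempotency_key, job);
    return { job, deduplicated: false };
  }
}
```

This is why `stripeEvent.id` is a good key: Stripe may deliver the same event more than once, but the event id never changes.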
Quick Start

Up and running in minutes

Enqueue a job, process it, done. It's that simple.

1
Enqueue a Job
// Your app: Send a job to the queue
const response = await fetch(
  'https://api.spooled.cloud/api/v1/jobs',
  {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      queue_name: 'emails',
      payload: {
        to: 'user@example.com',
        template: 'welcome',
      },
    }),
  }
);

// Job is now queued! ✓
2
Process Jobs
// Your worker: Claim → process → complete/fail
const WORKER_ID = 'worker-1';

while (true) {
  const res = await fetch('.../api/v1/jobs/claim', {
    method: 'POST',
    headers: { /* auth */ },
    body: JSON.stringify({
      queue_name: 'emails',
      worker_id: WORKER_ID,
      limit: 5,
    }),
  });

  const { jobs } = await res.json();

  // Process each claimed job, then report complete/fail
  for (const job of jobs) {
    await processJob(job);
  }

  // Back off briefly when the queue is empty
  if (jobs.length === 0) {
    await new Promise((r) => setTimeout(r, 1000));
  }
}
Bonus: Stream live queue stats (SSE)
# Stream queue stats (pending/processing/completed) in near real time
curl -N https://api.spooled.cloud/api/v1/events/queues/emails \
  -H "Authorization: Bearer sk_live_..."
How It Works

Simple API, powerful infrastructure

Get started in minutes. Queue jobs with a single API call and let Spooled handle retries, deduplication, and observability.

1

Queue a Job

Send jobs to Spooled via REST API. We store them reliably with deduplication and scheduling.

2

Your Workers Process

Your code (on your servers) polls for jobs, processes them, and reports success/failure. Failed jobs auto-retry.

3

Monitor Everything

Real-time dashboard shows all job states. Track success rates, latency, and failures.

1. Queue a job
# Queue a webhook event
curl -X POST https://api.spooled.cloud/api/v1/jobs \
  -H "Authorization: Bearer sk_live_..." \
  -H "Content-Type: application/json" \
  -d '{
    "queue_name": "stripe-webhooks",
    "payload": {
      "type": "invoice.paid",
      "customer_id": "cus_123"
    },
    "idempotency_key": "evt_abc123"
  }'
2. Process jobs
# Claim jobs
curl -X POST https://api.spooled.cloud/api/v1/jobs/claim \
  -H "Authorization: Bearer sk_live_..." \
  -H "Content-Type: application/json" \
  -d '{
    "queue_name": "stripe-webhooks",
    "worker_id": "worker-1",
    "limit": 5
  }'

# Complete a job (ack)
curl -X POST https://api.spooled.cloud/api/v1/jobs/job_xyz/complete \
  -H "Authorization: Bearer sk_live_..." \
  -H "Content-Type: application/json" \
  -d '{"worker_id":"worker-1"}'
Real-World Examples

See it in action

Copy-paste these examples into your project. Each one shows exactly how to integrate Spooled.

Spooled handles

  • Reliable job queuing & storage
  • Automatic retries with backoff
  • Job deduplication (idempotency)
  • Dead-letter queues
  • Real-time job monitoring

You provide

  • Workers (your code, your servers)
  • Email service (Resend, SendGrid...)
  • Storage (S3, Cloudflare R2...)
  • Any external APIs you need
  • Your business logic

🛒 E-Commerce Order Processing

Process Stripe payments reliably

When a customer completes checkout, Stripe sends a webhook. Spooled queues it; your worker processes the payment, sends the email, and updates inventory (with retries and deduplication).

💳 Stripe Webhook → 📥 Queue Job → Process Payment → 📧 Send Email

1 Queue the job

JavaScript
// Your Stripe webhook endpoint (runs on YOUR server)
app.post('/webhooks/stripe', async (req, res) => {
  const event = req.body;
  
  // Queue the job in Spooled (just stores the data)
  await fetch('https://api.spooled.cloud/api/v1/jobs', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer sk_live_...',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      queue_name: 'payments',
      payload: {
        event_type: event.type,
        customer_id: event.data.object.customer,
        amount: event.data.object.amount,
        order_id: event.data.object.metadata.order_id
      },
      idempotency_key: event.id  // Prevents duplicate processing
    })
  });
  
  res.status(200).send('Queued');  // Respond fast to Stripe
});

2 Process in worker

JavaScript
// Your worker (runs on YOUR server - could be a cron, container, etc.)
const WORKER_ID = 'worker-1';

async function processPayments() {
  // 1. Claim jobs from Spooled
  const res = await fetch('https://api.spooled.cloud/api/v1/jobs/claim', {
    method: 'POST',
    headers: { 
      'Authorization': 'Bearer sk_live_...',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({ queue_name: 'payments', worker_id: WORKER_ID, limit: 10 })
  });
  const { jobs } = await res.json();
  
  for (const job of jobs) {
    try {
      // 2. YOUR code does the actual work
      await db.orders.update(job.payload.order_id, { status: 'paid' });
      const customerEmail = await lookupCustomerEmail(job.payload.customer_id);
      await resend.emails.send({  // YOUR Resend/SendGrid account
        to: customerEmail,
        subject: 'Order Confirmed!'
      });
      
      // 3. Tell Spooled: success!
      await fetch(`https://api.spooled.cloud/api/v1/jobs/${job.id}/complete`, {
        method: 'POST',
        headers: { 
          'Authorization': 'Bearer sk_live_...',
          'Content-Type': 'application/json'
        },
        body: JSON.stringify({ worker_id: WORKER_ID })
      });
    } catch (err) {
      // Job will auto-retry later
      await fetch(`https://api.spooled.cloud/api/v1/jobs/${job.id}/fail`, {
        method: 'POST',
        headers: { 
          'Authorization': 'Bearer sk_live_...',
          'Content-Type': 'application/json'
        },
        body: JSON.stringify({ worker_id: WORKER_ID, error: err.message })
      });
    }
  }
}
Use Cases

Built for any async workload

From simple webhook handlers to complex event-driven architectures, Spooled scales with your needs.

Webhook Processing

Route webhooks from any HTTP source. Reduce failures with retries, idempotency keys, and a clear audit trail.

Example: Process payment events, deploy on push, handle SMS responses

Stripe webhooks GitHub webhooks Shopify Custom HTTP

Background Jobs

Offload slow operations like email sending, image processing, and report generation to background workers.

Example: Send welcome emails, generate invoices, resize images

email queue PDF generation video transcoding data exports

Scheduled Tasks

Schedule jobs for later execution with precise timing. Perfect for reminders, reports, and recurring tasks.

Example: Daily reports, subscription renewals, cleanup tasks

cron jobs scheduled tasks delayed execution recurring jobs

Event-Driven Workflows

Chain jobs together to build complex workflows. Each step can trigger the next with full error handling.

Example: Order processing, user onboarding, approval flows

workflow automation event sourcing saga pattern orchestration
Performance

Built for scale

100

Max retries (configurable)

1 h

Max job lease (visibility)

SSE

Live queue/job streams

JWT + API keys

Dashboard / API auth

Built for production workloads
Reliability

Automatic retries with exponential backoff

Failed jobs automatically retry with intelligent backoff. After all retries, jobs move to the Dead Letter Queue for inspection.

Initial Attempt

t = 0s

Job is picked up by worker for processing

1

Retry #1

t + 1m

First retry after 1 minute delay

2

Retry #2

t + 3m

Exponential backoff: 2 minute delay

3

Retry #3

t + 7m

Exponential backoff: 4 minute delay

Backoff Formula (matches the timeline above)
delay_minutes = min(2^retry_count, 60)

Default job retries are scheduled in minutes (1m, 2m, 4m, ...), capped at 60m. (Queue/job config can override max retries.)
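
The formula maps directly to code. This sketch (illustrative; Spooled tracks retry state server-side) computes each delay, the cumulative schedule, and the dead-letter transition:

```javascript
// Retry delay per the documented formula: min(2^retry_count, 60) minutes
function backoffDelayMinutes(retryCount) {
  return Math.min(2 ** retryCount, 60);
}

// Minutes after the initial attempt at which each retry fires:
// retrySchedule(3) -> [1, 3, 7], matching the timeline above
function retrySchedule(maxRetries) {
  const schedule = [];
  let elapsed = 0;
  for (let i = 0; i < maxRetries; i++) {
    elapsed += backoffDelayMinutes(i);
    schedule.push(elapsed);
  }
  return schedule;
}

// After exhausting retries, a job moves to the dead-letter queue
function onFailure(job, maxRetries) {
  const retryCount = (job.retry_count ?? 0) + 1;
  if (retryCount > maxRetries) {
    return { ...job, retry_count: retryCount, state: 'dead_letter' };
  }
  return {
    ...job,
    retry_count: retryCount,
    state: 'retrying',
    delay_minutes: backoffDelayMinutes(retryCount - 1),
  };
}
```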

Live Demo

Watch jobs flow through the queue

Spooled stores your jobs. Your workers claim and process them. Failed jobs auto-retry.

Interactive demo: jobs flow Incoming → Processing → Completed, with live counters for jobs/sec, average latency, success rate, and retries.
Real-time Streaming

Publish once, stream updates everywhere

Queue jobs via REST. Stream queue/job updates via SSE (and WebSocket for dashboards).

Publisher

Your Application

# Enqueue a job
POST
/api/v1/jobs
{
"queue_name": "orders",
"payload": {...}
}

Spooled

Queue & Store


Stream Client

SSE: Queue stats

# Live event stream
GET
/api/v1/events/queues/orders
{
"event": "queue.stats",
"data": { pending, processing, ... }
}
queue-stream.sh
# Stream live queue stats via SSE
curl -N https://api.spooled.cloud/api/v1/events/queues/orders \
  -H "Authorization: Bearer sk_live_..."

# Dashboards can also use WebSocket (JWT auth via query param):
# GET /api/v1/ws?token=JWT_ACCESS_TOKEN&queue=orders
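
Each SSE message arrives as plain-text `event:`/`data:` lines. A minimal parser for a single frame (field handling follows the SSE format; the payload shape is an assumption based on the queue.stats example above):

```javascript
// Parse one SSE frame ("event:" / "data:" lines) into an object.
// Minimal sketch: ignores "id:"/"retry:" fields and multi-line data.
function parseSseFrame(frame) {
  const out = { event: 'message', data: '' };
  for (const line of frame.split('\n')) {
    if (line.startsWith('event:')) out.event = line.slice(6).trim();
    else if (line.startsWith('data:')) out.data += line.slice(5).trim();
  }
  return { event: out.event, data: out.data ? JSON.parse(out.data) : null };
}
```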
Distributed Processing

Scale with your worker pool

Run multiple workers on your servers. Spooled distributes jobs automatically—no coordination needed.

You run: Workers on your infrastructure  •  Spooled provides: Queue coordination & job locking

Your Worker Pool

Your servers claim jobs from Spooled

Interactive demo: three workers on your servers poll the job queue; counters track workers, claimed jobs, and queued jobs.
Cron Schedules

Recurring jobs made simple

Create jobs that run on a schedule. Daily reports, subscription renewals, cleanup tasks—set it once and forget it.

Create Recurring Schedule
// Run daily at 9 AM
await fetch('https://api.spooled.cloud/api/v1/schedules', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${API_KEY}`,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    name: 'daily-sales-report',
    cron_expression: '0 9 * * *',
    timezone: 'America/New_York',
    queue_name: 'reports',
    payload_template: {
      report_type: 'daily_sales',
      format: 'pdf'
    }
  })
});

// ✓ Runs automatically every day
// ✓ Retries on failure
// ✓ Timezone-aware
Schedule for Later
// Send reminder in 1 hour
await fetch('https://api.spooled.cloud/api/v1/jobs', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${API_KEY}`,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    queue_name: 'notifications',
    payload: {
      type: 'reminder',
      user_id: 123
    },
    scheduled_at: new Date(
      Date.now() + 3600000 // +1 hour
    ).toISOString()
  })
});

// ✓ Runs at exact time
// ✓ Perfect for reminders, trials, renewals
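
Computing scheduled_at inline is easy to get wrong; a tiny helper (the helper name is ours, for illustration) keeps it in the ISO 8601 format the API expects:

```javascript
// Build a scheduled_at timestamp (ISO 8601) some milliseconds from now.
// Hypothetical helper for illustration -- not part of any Spooled SDK.
function scheduledAt(msFromNow, now = Date.now()) {
  return new Date(now + msFromNow).toISOString();
}

// Usage: scheduled_at: scheduledAt(3600000)  // +1 hour
```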

Common Cron Patterns

0 * * * * Every hour
0 9 * * * Daily at 9 AM
0 0 * * 0 Weekly on Sunday
0 0 1 * * Monthly on 1st
*/15 * * * * Every 15 minutes
0 9 * * 1-5 Weekdays at 9 AM
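
For intuition, here is a minimal matcher for the five-field patterns above. It supports `*`, plain numbers, `*/n` steps, and `a-b` ranges, and simplifies day-of-month/day-of-week handling (real cron ORs them when both are restricted). Illustrative only; Spooled evaluates schedules server-side, with timezone support.

```javascript
// Does one cron field match a value? Handles "*", numbers, "*/n", "a-b".
function fieldMatches(field, value) {
  if (field === '*') return true;
  if (field.startsWith('*/')) return value % Number(field.slice(2)) === 0;
  if (field.includes('-')) {
    const [lo, hi] = field.split('-').map(Number);
    return value >= lo && value <= hi;
  }
  return Number(field) === value;
}

// Check a Date against a 5-field expression: minute hour dom month dow
function cronMatches(expr, date) {
  const [min, hour, dom, mon, dow] = expr.split(/\s+/);
  return (
    fieldMatches(min, date.getMinutes()) &&
    fieldMatches(hour, date.getHours()) &&
    fieldMatches(dom, date.getDate()) &&
    fieldMatches(mon, date.getMonth() + 1) &&
    fieldMatches(dow, date.getDay())
  );
}
```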
Workflows

Orchestrate complex workflows

Chain jobs together with dependencies. Build multi-step processes where each job waits for its dependencies to complete.

Create Account
Step 1
Send Welcome
Step 2 (waits for Step 1)
Setup Defaults
Step 3 (waits for Step 1)
User Onboarding Workflow
// Create a workflow with job dependencies
await fetch('https://api.spooled.cloud/api/v1/workflows', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${API_KEY}`,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    name: 'user-onboarding',
    jobs: [
      {
        name: 'create-account',
        queue_name: 'users',
        payload: {
          email: 'user@example.com',
          plan: 'pro'
        }
      },
      {
        name: 'send-welcome-email',
        queue_name: 'emails',
        depends_on: ['create-account'],  // Waits for this job
        payload: {
          template: 'welcome'
        }
      },
      {
        name: 'setup-defaults',
        queue_name: 'users',
        depends_on: ['create-account'],  // Also waits
        payload: {
          settings: {...}
        }
      }
    ]
  })
});

// ✓ Jobs run in dependency order
// ✓ If parent fails, children are cancelled
// ✓ Parallel execution where possible

Automatic Ordering

Jobs run in the correct order based on dependencies. No manual coordination needed.

Failure Handling

If a parent job fails, dependent children are automatically cancelled.

Parallel Execution

Independent jobs run in parallel for maximum throughput.
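
The ordering described above is a topological sort over depends_on. A sketch using Kahn's algorithm (illustrative; Spooled resolves dependencies server-side) groups jobs into "waves" that can run in parallel:

```javascript
// Group workflow jobs into dependency-ordered waves (Kahn's algorithm).
// Jobs within a wave have no unmet dependencies and can run in parallel.
function executionOrder(jobs) {
  const remaining = new Map(jobs.map((j) => [j.name, new Set(j.depends_on ?? [])]));
  const order = [];
  while (remaining.size > 0) {
    const ready = [...remaining.keys()].filter((n) => remaining.get(n).size === 0);
    if (ready.length === 0) throw new Error('Dependency cycle detected');
    order.push(ready);
    for (const name of ready) remaining.delete(name);
    for (const deps of remaining.values()) ready.forEach((n) => deps.delete(n));
  }
  return order;
}
```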

Job Priority

Process urgent jobs first

Higher priority jobs jump the queue. Perfect for VIP customers, critical alerts, and time-sensitive tasks.

Priority Levels
// High priority - processed first (VIP customer order)
await fetch('https://api.spooled.cloud/api/v1/jobs', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${API_KEY}`,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    queue_name: 'orders',
    payload: {
      order_id: 789,
      customer_tier: 'vip'
    },
    priority: 10  // High priority
  })
});

// Normal priority (default)
await createJob({
  queue_name: 'orders',
  payload: {...order},
  priority: 0  // Default
});

// Low priority - background cleanup
await createJob({
  queue_name: 'maintenance',
  payload: {...cleanup},
  priority: -10  // Low priority
});

// Workers claim jobs: High → Normal → Low

Processing Order

VIP Order
Priority: 10
↑ First
Normal
Priority: 0
↑ Second
Cleanup
Priority: -10
↑ Last
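
Claim ordering reduces to a sort by priority, highest value first (a sketch; Spooled orders jobs server-side when workers claim):

```javascript
// Order jobs for claiming: highest priority first; missing priority means 0.
function claimOrder(jobs) {
  return [...jobs].sort((a, b) => (b.priority ?? 0) - (a.priority ?? 0));
}
```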
Outgoing Webhooks

Get notified when events occur

Spooled POSTs to your configured URLs when jobs complete, fail, or queues pause. Connect to Slack, Discord, your own app, or any webhook endpoint.

Configure Notifications
// Setup webhook for job events
await fetch('https://api.spooled.cloud/api/v1/outgoing-webhooks', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${API_KEY}`,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    name: 'Slack Notifications',
    url: 'https://hooks.slack.com/...',
    events: [
      'job.completed',
      'job.failed',
      'queue.paused'
    ],
    secret: 'your-hmac-secret'  // For signature verification
  })
});

// ✓ Spooled POSTs to your URL
// ✓ Automatic retries
// ✓ Delivery history in dashboard
Per-Job Completion Webhook
// Get notified when THIS job completes
await fetch('https://api.spooled.cloud/api/v1/jobs', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${API_KEY}`,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    queue_name: 'exports',
    payload: {
      report_id: 12345
    },
    completion_webhook: 'https://your-app.com/webhooks/export-done'
  })
});

// When job completes, Spooled POSTs to your URL:
// POST https://your-app.com/webhooks/export-done
// {status: "completed", job_id: "...", result: {...}}

Available Events

job.created
job.started
job.completed
job.failed
job.cancelled
queue.paused
queue.resumed
worker.registered
worker.deregistered
schedule.triggered
💬

Slack Alerts

Get notified in Slack when jobs fail

📊

Analytics

Send events to your analytics platform

🎯

Custom Logic

Trigger your own workflows

📧

Email Notifications

Alert teams about important events

Architecture

Production-ready from day one

Built with Rust for performance and reliability. PostgreSQL for durability. Redis for real-time pub/sub.

System Architecture

Multi-tenant queue with PostgreSQL RLS, Redis pub/sub, and real-time updates.

%%{init: {'theme': 'base', 'themeVariables': { 'primaryColor': '#ecfdf5', 'primaryTextColor': '#065f46', 'primaryBorderColor': '#10b981', 'lineColor': '#6b7280', 'secondaryColor': '#eff6ff', 'tertiaryColor': '#faf5ff', 'fontSize': '18px', 'fontFamily': 'Inter, system-ui, sans-serif'}}}%%
flowchart LR
  subgraph sources[" 📥 Webhook Sources "]
    GH["GitHub"]
    ST["Stripe"]
    CU["Custom HTTP"]
    HC["HTTP Clients"]
  end

  subgraph backendSvc[" ⚡ Spooled Backend "]
    API["REST API"]
    GRPC["gRPC (optional)"]
    RT["WebSocket/SSE"]
  end

  subgraph storage[" 💾 Data Plane "]
    PG[("PostgreSQL")]
    RD[("Redis")]
  end

  subgraph obs[" 📊 Observability "]
    PR["Prometheus /metrics"]
    GF["Grafana (optional)"]
  end

  subgraph dashboard[" 🖥️ Dashboard "]
    DB["Realtime UI"]
  end

  GH --> API
  ST --> API
  CU --> API
  HC --> API
  HC --> GRPC

  API --> PG
  GRPC --> PG
  RT --> RD
  PG --> DB
  RD --> DB
  API --> PR
  PR --> GF
Rust
Backend
PostgreSQL
Database
Redis
Pub/Sub
RLS
Multi-tenant
Universal Webhooks

Accept webhooks from any source

Receive HTTP POST requests from any service. No vendor lock-in, no special configuration.

Stripe

Payment events

GitHub

Repository events

Shopify

Order updates

Custom

Any HTTP POST

Simple HTTP Interface

Just POST to /api/v1/jobs with your payload. No SDKs required—works with cURL or any HTTP client. SDKs are planned.

View Docs →
Open Source

Built in the open,
for the community

Spooled is 100% open-source under the Apache 2.0 license. Inspect the code, contribute features, or deploy on your own infrastructure.

TypeScript Rust PostgreSQL Redis Docker Apache 2.0

No vendor lock-in

Self-host on your infrastructure or use our managed cloud. Your data, your control.

Transparent & auditable

Review the source code. Run security audits. Know exactly how your data is handled.

Community-driven

Built by developers, for developers. Contribute features, report bugs, shape the roadmap.

Ready to make your webhooks reliable?

Get started for free. No credit card required. Upgrade when you need more capacity.

No credit card required Free tier forever Open source