Quickstart
Get up and running with Spooled Cloud in 5 minutes. This guide covers the basics of queuing jobs, processing them with workers, and handling failures gracefully.
SDK Status
The Node.js, Python, and Go SDKs are under active development. The code examples below show the planned API. For production use today, we recommend the cURL / HTTP API examples. See SDKs for current status.
API Options
Spooled Cloud provides two APIs:
- REST API (port 8080) — HTTP/1.1 + JSON, best for web apps and simple integrations
- gRPC API (port 50051) — HTTP/2 + Protobuf, best for high-throughput workers with streaming
This quickstart covers the REST API. See the gRPC documentation for high-performance worker implementations using gRPC streaming.
Prerequisites
Before you begin, you'll need:
- A Spooled Cloud account — Sign up for free
- An API key — Available in your dashboard
- cURL, Node.js, Python, or Go — Any HTTP client will work
Step 1: Queue Your First Job
Jobs are the fundamental unit in Spooled. A job represents a task to be processed asynchronously, like sending an email, processing an image, or delivering a webhook.
Using cURL
curl -X POST https://api.spooled.cloud/api/v1/jobs \
-H "Authorization: Bearer sk_live_YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"queue": "my-queue",
"payload": {
"event": "user.created",
"user_id": "usr_123",
"email": "alice@example.com"
},
"idempotency_key": "user-created-usr_123"
}'
The response includes the job ID and status. The idempotency_key prevents duplicate
processing if you retry the request.
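Here's an illustrative response (the exact shape may change while the API is in development; field names here are assumptions):
{
  "id": "job_xyz123",
  "queue": "my-queue",
  "status": "queued",
  "created_at": "2025-01-15T10:30:00Z"
}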
Using Node.js
First, install the SDK:
npm install @spooled/sdk
Then queue a job:
import { SpooledClient } from '@spooled/sdk';
const client = new SpooledClient({
apiKey: process.env.SPOOLED_API_KEY!,
});
// Example user ID; in a real app this comes from your data
const userId = 'usr_123';
// Queue a job
const job = await client.jobs.enqueue({
queue: 'email-notifications',
payload: {
to: 'user@example.com',
subject: 'Welcome!',
template: 'welcome',
},
idempotencyKey: `welcome-${userId}`,
maxRetries: 5,
});
console.log(`Queued job: ${job.id}`);
Using Python
First, install the SDK:
pip install spooled-sdk
Then queue a job:
from spooled import SpooledClient
import os
client = SpooledClient(api_key=os.environ["SPOOLED_API_KEY"])
# Example image ID; in a real app this comes from your data
image_id = "img_123"
# Queue a background job
job = client.jobs.enqueue(
queue="image-processing",
payload={
"image_url": "https://example.com/image.jpg",
"operations": ["resize", "compress"],
"output_format": "webp"
},
idempotency_key=f"process-image-{image_id}",
max_retries=3
)
print(f"Queued job: {job.id}")
Using Go
First, install the SDK:
go get github.com/spooled-cloud/spooled-go
Then queue a job:
package main
import (
"context"
"log"
"os"
"github.com/spooled-cloud/spooled-go"
)
func main() {
client := spooled.NewClient(os.Getenv("SPOOLED_API_KEY"))
// Example order data; in a real app this comes from your database
orderId := "ord_123"
orderData := map[string]interface{}{"total": 4999, "currency": "usd"}
job, err := client.Jobs.Enqueue(context.Background(), &spooled.EnqueueParams{
Queue: "webhook-delivery",
Payload: map[string]interface{}{
"url": "https://customer.example.com/webhook",
"event": "order.completed",
"payload": orderData,
},
IdempotencyKey: "order-webhook-" + orderId,
MaxRetries: 5,
})
if err != nil {
log.Fatal(err)
}
log.Printf("Queued job: %s", job.ID)
}
Step 2: Process Jobs with a Worker
Workers claim jobs from queues and process them. If processing fails, Spooled automatically retries with exponential backoff.
Simple Worker Pattern (cURL)
Claim jobs:
# Claim up to 5 jobs
curl -X POST https://api.spooled.cloud/api/v1/jobs/claim \
-H "Authorization: Bearer sk_live_YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{"queue": "my-queue", "limit": 5}'
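The claim response returns the claimed jobs with their payloads so your worker can process them. The shape below is a sketch; field names are assumptions:
{
  "jobs": [
    {
      "id": "job_xyz123",
      "queue": "my-queue",
      "payload": {"event": "user.created", "user_id": "usr_123"},
      "attempts": 1
    }
  ]
}
Complete or fail jobs: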
# Complete a job
curl -X POST https://api.spooled.cloud/api/v1/jobs/job_xyz123/complete \
-H "Authorization: Bearer sk_live_YOUR_API_KEY"
# Or fail it (will retry)
curl -X POST https://api.spooled.cloud/api/v1/jobs/job_xyz123/fail \
-H "Authorization: Bearer sk_live_YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{"reason": "Connection timeout"}'
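Until the SDKs stabilize, these calls can be combined into a minimal polling worker. The sketch below assumes the illustrative claim-response shape above and a process_job handler of your own; adjust the jq paths to the real response:
#!/usr/bin/env bash
# Minimal polling worker sketch. Claims jobs, runs a handler,
# then completes or fails each one.
API="https://api.spooled.cloud/api/v1"
AUTH="Authorization: Bearer $SPOOLED_API_KEY"
while true; do
  # Claim up to 5 jobs from the queue
  jobs=$(curl -s -X POST "$API/jobs/claim" \
    -H "$AUTH" -H "Content-Type: application/json" \
    -d '{"queue": "my-queue", "limit": 5}')
  for id in $(echo "$jobs" | jq -r '.jobs[].id'); do
    if process_job "$id"; then  # process_job is your own handler
      curl -s -X POST "$API/jobs/$id/complete" -H "$AUTH"
    else
      curl -s -X POST "$API/jobs/$id/fail" \
        -H "$AUTH" -H "Content-Type: application/json" \
        -d '{"reason": "handler failed"}'
    fi
  done
  sleep 1  # back off briefly between polls
done
Node.js Worker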
import { SpooledWorker } from '@spooled/sdk';
const worker = new SpooledWorker({
apiKey: process.env.SPOOLED_API_KEY!,
queue: 'email-notifications',
concurrency: 10,
});
worker.on('job', async (job) => {
const { to, subject, template } = job.payload;
try {
// sendEmail is your own delivery function
await sendEmail({ to, subject, template });
await job.complete();
console.log(`Sent email to ${to}`);
} catch (error) {
await job.fail({ reason: error.message });
}
});
worker.start();
console.log('Worker started, listening for jobs...');
Python Worker
from spooled import SpooledWorker
import os
worker = SpooledWorker(
api_key=os.environ["SPOOLED_API_KEY"],
queue="image-processing"
)
@worker.job_handler
async def process_image(job):
image_url = job.payload["image_url"]
operations = job.payload["operations"]
# Process the image with your own pipeline function
result = await run_image_pipeline(image_url, operations)
# Store the result
await job.complete(result={"output_url": result.url})
worker.run()
Key Concepts
Idempotency
Use idempotency_key to prevent
duplicate processing. If a job with the same key exists, the request returns the existing job.
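For example, running the same enqueue request twice is safe; the second call returns the job created by the first instead of creating a duplicate:
# Both calls carry the same idempotency_key, so only one job is created.
curl -X POST https://api.spooled.cloud/api/v1/jobs \
  -H "Authorization: Bearer sk_live_YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"queue": "my-queue", "payload": {"user_id": "usr_123"}, "idempotency_key": "user-created-usr_123"}'
# Repeating the identical command returns the same job ID.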
Automatic Retries
Failed jobs retry automatically with exponential backoff. Configure max retries and backoff curves per job or per queue.
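Per-job limits are set at enqueue time, as in the SDK examples above. A cURL equivalent might look like this; max_retries mirrors the SDK parameter, while the backoff fields are assumed names for illustration (see the retry documentation for the actual schema):
curl -X POST https://api.spooled.cloud/api/v1/jobs \
  -H "Authorization: Bearer sk_live_YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "queue": "my-queue",
    "payload": {"event": "user.created"},
    "max_retries": 5,
    "backoff": {"type": "exponential", "base_seconds": 2}
  }'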
Dead-Letter Queue
Jobs that exhaust retries move to the DLQ. Inspect failures and replay them with a single API call.
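A replay call might look like the following; the endpoint path is an assumption for illustration, so check the DLQ documentation for the actual route:
# Replay a dead-lettered job back onto its original queue.
# The /dlq/.../replay path is illustrative, not a confirmed route.
curl -X POST https://api.spooled.cloud/api/v1/dlq/job_xyz123/replay \
  -H "Authorization: Bearer sk_live_YOUR_API_KEY"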
Real-time Updates
Stream job updates via WebSocket or SSE. Get instant notifications when jobs complete or fail.
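With cURL you can follow an SSE stream directly; the events path below is an assumed example:
# Follow job events as they happen (-N disables output buffering).
# The /events path and query parameter are illustrative.
curl -N "https://api.spooled.cloud/api/v1/events?queue=my-queue" \
  -H "Authorization: Bearer sk_live_YOUR_API_KEY" \
  -H "Accept: text/event-stream"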
Next Steps
- Learn about jobs and queues — Deep dive into job lifecycle
- Configure retry behavior — Customize backoff strategies
- Set up webhook ingestion — Accept webhooks from external services
- Build production workers — Best practices for reliable processing
- Explore the SDKs — Node.js, Python, and Go clients