Stop building queues.
Start shipping features.
Publish, schedule, and deliver HTTP requests with one API call. Automatic retries, cron schedules, URL groups, and real-time visibility—no Redis, workers, or ops to maintain.
Built for teams that ship fast
Join hundreds of teams using Relay to deliver webhooks reliably without managing infrastructure.
"Relay replaced our entire background job infrastructure. What took us weeks to build and maintain now just works."
"The DLQ and replay features have saved us countless hours debugging webhook failures. Essential for any production app."
"We migrated from QStash for the better pricing and URL groups feature. Fan-out notifications are now trivial."
Shipping reliable
async work is hard.
Serverless teams end up rebuilding queues, retry logic, and monitoring for every product. Ops overhead grows while reliability still slips.
Lost messages
Webhook fails? Data gone. No durable queue, no retries, no DLQ for recovery when endpoints choke.
Infrastructure overhead
Redis clusters, cron workers, region failover, monitoring—weeks of plumbing and on-call toil instead of shipping features.
Serverless timeouts
Edge limits of 10–30s make background work brittle. Long-running tasks and fan-out patterns become a guessing game.
Zero visibility
When deliveries fail, you’re blind. No traceability, no signatures, and no easy replay to unblock customers.
await fetch("https://api.anlyon.com/publish", {
  method: "POST",
  body: JSON.stringify({
    url: "https://your-api.com/webhook",
    body: { event: "order.created" }
  })
})
One API call.
Infinite reliability.
Anlyon Relay handles the complexity so you don't have to. Publish a message, and we'll make sure it gets delivered.
Automatic retries & DLQ
Exponential backoff, bounded retries, and durable DLQ so failed deliveries can be replayed safely.
Cron, delays, and scheduling
One-off, relative, and cron schedules without running cron workers. Drift-free delivery across regions; see the sketch after these cards.
URL groups & signatures
Fan-out to multiple endpoints with per-target timeouts, HMAC signatures, and independent retry policies.
Observability & replay
Live analytics, searchable history, and safe replay from DLQ so teams debug fast without guesswork.
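Here is a minimal TypeScript sketch of scheduling over the publish endpoint. The `retries` and `delay` fields come from the examples on this page; the `cron` field, the `publish` helper, and the `ANLYON_API_KEY` variable are illustrative assumptions rather than documented API.

// Thin wrapper around the publish endpoint (helper name is illustrative).
async function publish(message: Record<string, unknown>): Promise<Response> {
  return fetch("https://api.anlyon.com/publish", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.ANLYON_API_KEY}`,
      "Content-Type": "application/json"
    },
    body: JSON.stringify(message)
  });
}

// One-off delivery in 60 seconds, retried with exponential backoff if the endpoint fails.
await publish({
  url: "https://your-api.com/webhook",
  body: { event: "order.created" },
  retries: 5,
  delay: 60 // seconds
});

// Recurring delivery, e.g. a nightly report job (`cron` is an assumed field name).
await publish({
  url: "https://your-api.com/jobs/nightly-report",
  cron: "0 2 * * *"
});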
Works with your stack
Deploy anywhere. Anlyon Relay integrates seamlessly with Vercel, Cloudflare Workers, AWS Lambda, and any platform that can make HTTP requests.
Platforms
Frameworks
export default async (req: Request) => {
const { email, template } = await req.json();
// Queue email for reliable delivery
await fetch("https://api.anlyon.com/publish", {
method: "POST",
headers: { Authorization: `Bearer ${API_KEY}` },
body: JSON.stringify({
url: "https://api.sendgrid.com/mail",
body: { to: email, template },
retries: 5
})
});
return Response.json({ queued: true });
}
Serverless messaging via HTTP
1 Publish a message
Make a POST request to our API with your destination URL and payload. Schedule it for later or send immediately.
2 We handle delivery
Automatic retries with exponential backoff. Failed messages go to your dead letter queue for inspection.
3 Monitor everything
Real-time analytics dashboard. Track delivery rates, latency, and failures. Debug issues instantly.
import Client from "@anlyon/sdk";

const client = new Client({
  token: "<API_KEY>"
});

const response = await client.publish({
  url: "https://api.myapp.com/webhook",
  body: {
    userId: 12345,
    action: "welcome_email"
  },
  delay: 30 // seconds
});
Built-in reliability.
Zero infrastructure required.
Everything you expect from a production queue—delays, cron, DLQ, fan-out, observability—delivered over a simple HTTP API.
Declarative schedules & delays
Cron, one-off, and relative delays without running cron workers. Drift-free scheduling with automatic retry and DLQ baked in.
URL groups & fan-out
Broadcast to multiple endpoints with per-target signatures, timeouts, and retry policies. Perfect for webhooks and multi-tenant delivery; see the sketch after these cards.
Named queues & concurrency
Define queues with per-queue concurrency, ordering, and rate limits. Protect upstreams while keeping latency low.
Observability & replay
Search every message, inspect payloads, signatures, and headers, then replay safely from DLQ with one click.
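A rough sketch of how fan-out and named queues could look from the caller's side. Every field beyond `url`, `body`, `retries`, and `delay` is an assumed name used for illustration, not a documented parameter.

// Illustrative only: `group`, `queue`, and `concurrency` are assumed field names.
await fetch("https://api.anlyon.com/publish", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.ANLYON_API_KEY}`,
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    // Fan out a single event to every endpoint registered under a URL group.
    group: "order-webhooks",
    body: { event: "order.created", orderId: "ord_123" },
    // Route through a named queue so upstream APIs see bounded concurrency.
    queue: "notifications",
    concurrency: 10,
    retries: 5
  })
});

Each target in a group keeps its own signature, timeout, and retry policy, so one slow consumer never blocks the rest.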
Built for the edge.
Optimized for reliability.
Queue, schedule, and deliver webhooks without owning infrastructure—built for serverless velocity.
Email Sending
Offload slow SMTP transactions to background workers to keep your API snappy.
- Automatic Retries
- Non-blocking
Image Processing
Resize and optimize user uploads asynchronously without blocking the UI.
- Scalable Workers
- Edge Compatible
Webhook Buffering
Ingest webhooks from Stripe or Twilio immediately, then process them at your own pace (sketched after these cards).
- Rate Limiting
- No Data Loss
AI Job Queues
Manage expensive LLM inference tasks efficiently with concurrency controls.
- FIFO Ordering
- Long Polling
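A sketch of the webhook-buffering pattern: acknowledge Stripe with a fast 2xx, then hand the raw event to Relay for paced, retried delivery to an internal processor. The internal URL and handler shape are illustrative.

// Sketch only: the internal processing URL is illustrative.
export default async (req: Request): Promise<Response> => {
  // NOTE: verify the Stripe signature before trusting the payload in production.
  const event = await req.json();

  // Queue the raw event with Relay instead of processing it inline.
  await fetch("https://api.anlyon.com/publish", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.ANLYON_API_KEY}`,
      "Content-Type": "application/json"
    },
    body: JSON.stringify({
      url: "https://your-api.com/internal/process-stripe-event",
      body: event,
      retries: 5
    })
  });

  // Stripe only needs a fast 2xx; the heavy lifting happens asynchronously.
  return Response.json({ received: true });
};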
Simple, transparent pricing
Start free, scale as you grow.
No credit card required to start. Upgrade anytime.
Hobby
Perfect for side projects and prototypes.
- 50,000 messages / month
- 7-day message retention
- 3 queues
- Community support
- Basic analytics
Pro
For production applications scaling up.
- Pay as you go
- Unlimited message retention
- 99.99% Uptime SLA
- Priority support
- Advanced analytics
- URL groups & fan-out
Frequently asked questions
Everything you need to know about Anlyon Relay.
How is this different from QStash?
Do I need Redis or any infrastructure?
Can I use this with Vercel, Netlify, or Cloudflare?
What happens if my endpoint is down?
Can I use this with any programming language?
How secure is my data?
Still have questions?
Contact support
Ready to ship faster?
Start for free with 50,000 messages per month. No credit card required. Scale as you grow.