Why Vercel AI SDK + LetsPing is a good fit
Vercel AI SDK makes it easy to build streaming, tool-using agents as route handlers. LetsPing adds a behavioral firewall and human-in-the-loop (HITL) approvals on top, so you keep the simple streamText interface while gaining precise control over risky tools. If an agent hallucinates a destructive payload, Vercel Queues will faithfully deliver it; LetsPing intercepts the call before it touches your infrastructure. Don't put an autonomous LLM payload into a queue without wrapping it in a deterministic firewall first.
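The "deterministic firewall first" rule can be sketched as a plain validation gate that runs before anything reaches a queue. This is purely illustrative: the payload shape, the policy limit, and the function name are assumptions for the sketch, not part of the LetsPing or Vercel APIs.

```typescript
// Hypothetical pre-enqueue gate: deterministically validate an
// LLM-proposed payload before it is allowed anywhere near a queue.
// The schema and the 10,000 policy cap are illustrative assumptions.
type RefundPayload = { user_id: string; amount: number };

function validateRefundPayload(raw: unknown): RefundPayload {
  if (typeof raw !== "object" || raw === null) {
    throw new Error("payload must be an object");
  }
  const p = raw as Record<string, unknown>;
  if (typeof p.user_id !== "string" || p.user_id.length === 0) {
    throw new Error("user_id must be a non-empty string");
  }
  if (typeof p.amount !== "number" || !(p.amount > 0) || p.amount > 10_000) {
    throw new Error("amount must be a positive number within policy limits");
  }
  return { user_id: p.user_id, amount: p.amount };
}
```

The point is that the gate is deterministic code, not another model: a hallucinated or malformed payload fails loudly instead of being enqueued.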
Step 1 — Install adapters and SDK
npm install @letsping/sdk @letsping/adapters zod
The @letsping/adapters/vercel package exposes a letsPing helper that wraps tools with HITL and firewall logic.
Step 2 — Wrap risky tools
Inside your route handler, wrap any tool that should require human approval:
// app/api/chat/route.ts
import { NextRequest } from "next/server";
import { streamText } from "ai";
import { letsPing } from "@letsping/adapters/vercel";
import { z } from "zod";
const refundUser = letsPing({
  name: "refund_user",
  description: "Issues a refund. Requires human approval before execution.",
  apiKey: process.env.LETSPING_API_KEY!,
  priority: "high",
  schema: z.object({
    user_id: z.string(),
    amount: z.number().positive(),
  }),
});

export async function POST(req: NextRequest) {
  const { messages } = await req.json();

  const result = await streamText({
    model: /* your model */,
    messages,
    tools: { refund_user: refundUser },
  });

  return result.toAIStreamResponse();
}

When the model calls refund_user, LetsPing pauses execution, parks state, and sends the request to the HITL console, governed by the behavioral firewall.
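"Parking state" can be pictured as serializing just enough of the pending tool call to resume it after a human decision. The shape below is an assumption for illustration, not the real LetsPing wire format.

```typescript
// Illustrative sketch of parking a pending tool call so it can be
// resumed after approval. ParkedCall is a hypothetical shape, not
// the actual LetsPing adapter's internal representation.
type ParkedCall = {
  requestId: string;
  tool: string;
  args: Record<string, unknown>;
  parkedAt: string; // ISO timestamp of when execution was paused
};

function parkToolCall(
  requestId: string,
  tool: string,
  args: Record<string, unknown>,
): string {
  const parked: ParkedCall = {
    requestId,
    tool,
    args,
    parkedAt: new Date().toISOString(),
  };
  // A durable blob: store it wherever your pending approvals live.
  return JSON.stringify(parked);
}

function resumeToolCall(blob: string): ParkedCall {
  return JSON.parse(blob) as ParkedCall;
}
```

The round trip matters because the approval may arrive minutes or hours later, long after the original request context is gone.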
Step 3 — Show triage status in the client
When used with streaming tool progress, the adapter can emit a progress event so your UI can show “Waiting for approval” instead of hanging:
In React, listen to the tool progress stream and render a badge when the status is something like "intercepted_by_firewall", with a triage_url linking to the LetsPing dashboard.
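A small helper can map streamed progress events to badge labels. The status values and event shape here are assumptions based on the description above, not the adapter's documented API.

```typescript
// Hypothetical mapping from a streamed tool-progress event to a UI
// badge label. "intercepted_by_firewall" is the status name assumed
// above; the other statuses are illustrative.
type ToolProgress = { status: string; triage_url?: string };

function badgeFor(progress: ToolProgress): string {
  switch (progress.status) {
    case "intercepted_by_firewall":
      // Link reviewers straight to the triage page when we have one.
      return progress.triage_url
        ? `Waiting for approval: ${progress.triage_url}`
        : "Waiting for approval";
    case "approved":
      return "Approved, running";
    case "denied":
      return "Denied by reviewer";
    default:
      return "Running";
  }
}
```

In a React component you would call something like badgeFor on each progress event and render the result, so the chat never appears to hang silently.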
Agents in background queues: push is your safety net
When your agent runs inside a Vercel Queue (or Workflow), there's no frontend for a developer to watch. LetsPing intercepts the background job and pings your phone, acting as the safety net. Same firewall, same HITL console: you get the payload and live graph wherever you are.
How the behavioral firewall sees your Vercel agents
Each wrapped tool call becomes a LetsPing request with a service and action. The Markov-based behavioral firewall watches the sequence of calls over time and trips guardrails (velocity, loops, semantic loops, media, etc.) when something deviates from normal behavior. Those events drop straight into the same HITL console your team uses for LangGraph and other stacks.
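To make one of those guardrails concrete, here is a minimal sketch of a velocity check: it trips when a service/action pair is called more than a threshold number of times inside a sliding time window. The class name, thresholds, and keying scheme are assumptions for illustration, not LetsPing's implementation.

```typescript
// Minimal sliding-window velocity guardrail sketch. Trips when a
// service/action pair is called more than maxCalls times within
// windowMs milliseconds. Parameters are illustrative assumptions.
class VelocityGuardrail {
  private calls = new Map<string, number[]>();

  constructor(private maxCalls: number, private windowMs: number) {}

  // Record a call at time `now` (ms) and return true if the rate
  // now exceeds the allowed velocity for this service/action pair.
  trips(service: string, action: string, now: number): boolean {
    const key = `${service}:${action}`;
    // Keep only timestamps still inside the window, then add this call.
    const recent = (this.calls.get(key) ?? []).filter(
      (t) => now - t < this.windowMs,
    );
    recent.push(now);
    this.calls.set(key, recent);
    return recent.length > this.maxCalls;
  }
}
```

A real firewall layers several such checks and feeds trips into the HITL console rather than hard-failing, but the core idea is the same: deterministic counters over the call sequence, not model judgment.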
Next steps
• Explore the behavioral firewall and HITL console overviews.
• See the examples page for real Vercel AI SDK recipes.
• If multiple agents or vendors touch the same Vercel route, consider agent-to-agent escrow to keep a cryptographic chain-of-custody.