
Vercel AI SDK Integration

Route all Vercel AI SDK calls through the Curate-Me gateway for automatic cost tracking, rate limiting, and PII scanning. The SDK's createOpenAI provider accepts a custom baseURL, so switching to the gateway is a one-property change.

Before and after

```ts
// Before — direct to OpenAI
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

const openai = createOpenAI({
  apiKey: 'sk-your-openai-key',
});

// After — through Curate-Me gateway
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

const openai = createOpenAI({
  apiKey: 'sk-your-openai-key',
  baseURL: 'https://api.curate-me.ai/v1/openai',    // ← added
  headers: { 'X-CM-API-Key': 'cm_sk_YOUR_KEY' },    // ← added
});
```

Every generateText, streamText, and generateObject call that uses this provider is now governed by the gateway.

Full example

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { generateText, streamText } from 'ai';

const openai = createOpenAI({
  apiKey: 'sk-your-openai-key',
  baseURL: 'https://api.curate-me.ai/v1/openai',
  headers: {
    'X-CM-API-Key': 'cm_sk_YOUR_KEY',
    'X-CM-Tags': 'project=vercel-ai-app,env=production', // optional cost tags
  },
});

// Non-streaming
const { text, usage } = await generateText({
  model: openai('gpt-4o-mini'),
  prompt: 'What is AI governance?',
});
console.log(text);
console.log(`Tokens: prompt=${usage.promptTokens}, completion=${usage.completionTokens}`);

// Streaming
const stream = streamText({
  model: openai('gpt-4o-mini'),
  prompt: 'List three benefits of LLM cost tracking.',
});
for await (const chunk of stream.textStream) {
  process.stdout.write(chunk);
}

// With system prompt and messages
const { text: answer } = await generateText({
  model: openai('gpt-4o-mini'),
  system: 'You are a concise technical advisor.',
  messages: [
    { role: 'user', content: 'How does rate limiting protect AI apps?' },
  ],
});
console.log(answer);
```
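The X-CM-Tags value is a comma-separated list of key=value pairs. A small helper can build that string from an object so tags stay consistent across providers; the helper name below is hypothetical, not part of the AI SDK or the gateway:

```typescript
// Hypothetical helper — not part of the AI SDK or the Curate-Me gateway.
// Serializes a record into the comma-separated key=value format the
// X-CM-Tags header expects.
function formatCmTags(tags: Record<string, string>): string {
  return Object.entries(tags)
    .map(([key, value]) => `${key}=${value}`)
    .join(',');
}

const cmTags = formatCmTags({ project: 'vercel-ai-app', env: 'production' });
console.log(cmTags); // project=vercel-ai-app,env=production
```

Pass the result as the 'X-CM-Tags' header when constructing the provider.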

Next.js route handler

```ts
// app/api/chat/route.ts
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const openai = createOpenAI({
  apiKey: process.env.OPENAI_API_KEY!,
  baseURL: 'https://api.curate-me.ai/v1/openai',
  headers: { 'X-CM-API-Key': process.env.CM_API_KEY! },
});

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({
    model: openai('gpt-4o-mini'),
    messages,
  });
  return result.toDataStreamResponse();
}
```

Prerequisites

```bash
npm install ai @ai-sdk/openai
```
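The Next.js route handler above reads both keys from the environment. A typical .env.local using the same variable names might look like this (values are placeholders):

```bash
# .env.local — never commit real keys
OPENAI_API_KEY=sk-your-openai-key
CM_API_KEY=cm_sk_YOUR_KEY
```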

What you get

Every Vercel AI SDK call through the gateway automatically receives:

| Feature | Description |
| --- | --- |
| Cost tracking | Per-request token and dollar cost recorded in real time |
| Rate limiting | Per-org, per-key request throttling |
| PII scanning | Regex scan for secrets and PII before hitting the provider |
| Model allowlists | Only approved models per your org policy |
| Budget caps | Daily and monthly spend limits enforced |
| Audit trail | Full request metadata logged for compliance |
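When a request violates policy (rate limit hit, budget cap exceeded, model not on the allowlist), the gateway rejects it before the provider is called, and the AI SDK surfaces the rejection as a thrown error carrying an HTTP status. The status-to-reason mapping below is an assumption for illustration (429 for throttling is conventional; check the gateway's error reference for the actual codes), but the extraction-then-branch shape applies regardless:

```typescript
// Assumed mapping from HTTP status to a gateway rejection reason.
// Illustrative only — confirm the real codes in the gateway docs.
function gatewayRejectionReason(status: number): string {
  switch (status) {
    case 429:
      return 'rate limit or budget cap reached — retry later or raise the cap';
    case 403:
      return 'model blocked by allowlist or request failed PII scan';
    default:
      return `unexpected gateway error (HTTP ${status})`;
  }
}

// Pull a numeric status off an unknown thrown value, if present.
// (AI SDK API-call errors expose a statusCode property.)
function statusFromError(err: unknown): number | undefined {
  if (err && typeof err === 'object' && 'statusCode' in err) {
    const s = (err as { statusCode?: unknown }).statusCode;
    return typeof s === 'number' ? s : undefined;
  }
  return undefined;
}

// Usage around any generateText / streamText call through the gateway:
//   try {
//     await generateText({ model: openai('gpt-4o-mini'), prompt });
//   } catch (err) {
//     const status = statusFromError(err);
//     if (status !== undefined) console.error(gatewayRejectionReason(status));
//     else throw err;
//   }
```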
