
Framework Integrations

The Curate-Me gateway works with any framework that uses OpenAI-compatible or Anthropic-compatible APIs. Swap the base URL and every LLM call gets governance — no SDK changes.

LangChain / LangGraph

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o",
    base_url="https://api.curate-me.ai/v1/openai",  # One line change
    default_headers={"X-CM-API-Key": "YOUR_GATEWAY_KEY"},
)
response = llm.invoke("Explain quantum computing")
```

Every chain and agent call now flows through the gateway with cost tracking, rate limiting, and PII scanning.

CrewAI

CrewAI reads the OPENAI_BASE_URL environment variable:

```shell
export OPENAI_BASE_URL=https://api.curate-me.ai/v1/openai
export OPENAI_API_KEY=YOUR_PROVIDER_KEY
export OPENAI_DEFAULT_HEADERS='{"X-CM-API-Key": "YOUR_GATEWAY_KEY"}'
```
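If you prefer configuring from code, the same variables can be set in-process, as long as this runs before CrewAI constructs its OpenAI client (the values are the same placeholders as above):

```python
import os

# Equivalent to the shell exports above; must run before CrewAI
# creates its OpenAI client.
os.environ["OPENAI_BASE_URL"] = "https://api.curate-me.ai/v1/openai"
os.environ["OPENAI_API_KEY"] = "YOUR_PROVIDER_KEY"
os.environ["OPENAI_DEFAULT_HEADERS"] = '{"X-CM-API-Key": "YOUR_GATEWAY_KEY"}'
```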
```python
from crewai import Agent, Task, Crew

researcher = Agent(role="Researcher", goal="Find data",
                   backstory="Tracks AI industry trends.", llm="gpt-4o")
task = Task(description="Research AI trends",
            expected_output="A short summary of current trends", agent=researcher)
crew = Crew(agents=[researcher], tasks=[task])
crew.kickoff()  # Governed by Curate-Me
```

OpenAI SDK

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.curate-me.ai/v1/openai",  # One line change
    api_key="YOUR_PROVIDER_KEY",
    default_headers={"X-CM-API-Key": "YOUR_GATEWAY_KEY"},
)
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
```

Claude / Anthropic SDK

```python
import anthropic

client = anthropic.Anthropic(
    base_url="https://api.curate-me.ai/v1/anthropic",  # One line change
    api_key="YOUR_ANTHROPIC_KEY",
    default_headers={"X-CM-API-Key": "YOUR_GATEWAY_KEY"},
)
message = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}],
)
```

TypeScript / Node.js

```typescript
import OpenAI from 'openai'

const client = new OpenAI({
  baseURL: 'https://api.curate-me.ai/v1/openai',
  apiKey: 'YOUR_PROVIDER_KEY',
  defaultHeaders: { 'X-CM-API-Key': 'YOUR_GATEWAY_KEY' },
})

const response = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello!' }],
})
```

Verifying Governance

After making a request, check that governance is active:

```shell
# Check your dashboard for the request
curl -s https://api.curate-me.ai/v1/models \
  -H "X-CM-API-Key: YOUR_GATEWAY_KEY" | jq '.data | length'
# Should return the number of available models
```

Visit dashboard.curate-me.ai/costs to see the request logged with cost, model, and governance status.

What You Get

Every request through the gateway automatically receives:

| Feature | What happens |
| --- | --- |
| Cost tracking | Request cost recorded in real time (Redis + MongoDB) |
| Rate limiting | Per-org, per-key request throttling |
| PII scanning | Regex scan for secrets/PII before hitting the provider |
| Model allowlist | Only approved models per your org policy |
| Budget caps | Daily/monthly spend limits enforced |
| HITL approval | High-cost requests flagged for human review |
| Audit trail | Every request logged with full metadata |
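The PII scan works, in spirit, like a set of regex checks applied to the request body before it leaves the gateway. A minimal sketch of the idea (the two patterns below are illustrative assumptions, not the gateway's actual rule set):

```python
import re

# Illustrative patterns only -- the gateway's real rule set is not public.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
}

def scan_for_pii(text: str) -> list[str]:
    """Return the names of all patterns that match the request text."""
    return [name for name, pattern in PATTERNS.items() if pattern.search(text)]

print(scan_for_pii("Contact jane@example.com, key AKIAABCDEFGHIJKLMNOP"))
# -> ['email', 'aws_access_key']
```

A request flagged this way can be blocked or redacted before it ever reaches the provider, which is why the scan runs gateway-side rather than in your application code.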