Using AI Gateway in Workers
The @cloudflare/ai-gateway SDK provides a convenient way to use AI Gateway directly from Cloudflare Workers. This package simplifies authentication and routing by automatically handling gateway configuration and provider credentials.
Install the package using npm:
```sh
npm install @cloudflare/ai-gateway
```

The SDK provides two main utilities:
The useAIGateway function configures your Worker to use AI Gateway. It automatically handles routing requests through your specified gateway.
```ts
import { useAIGateway } from "@cloudflare/ai-gateway";

export default {
  async fetch(request, env) {
    useAIGateway({ binding: env.AI, gateway: "your-gateway-name" });

    // Your AI provider SDK calls will now be routed through AI Gateway
  },
};
```

- `binding`: The AI binding from your Worker environment (typically `env.AI`)
- `gateway`: The name of your AI Gateway
The ProviderAuth class provides methods for authenticating with AI providers. It supports three authentication modes:
Use credentials stored in Cloudflare's AI Gateway. This is the recommended approach for production use.
```ts
import { ProviderAuth } from "@cloudflare/ai-gateway";

const auth = ProviderAuth.storedKey();
```

Learn more about storing keys in AI Gateway.
Use Cloudflare's AI Gateway billing to pay for and authenticate your inference requests.
```ts
import { ProviderAuth } from "@cloudflare/ai-gateway";

const auth = ProviderAuth.unifiedBilling();
```

Learn more about Unified Billing.
Pass your provider's API key directly from environment variables. This is useful for development and testing.
```ts
const auth = env.OPENAI_API_KEY; // or any other provider's API key
```

Here's a complete example using the OpenAI SDK with AI Gateway in a Worker:
```ts
import OpenAI from "openai";
import { useAIGateway, ProviderAuth } from "@cloudflare/ai-gateway";

export default {
  async fetch(request, env) {
    // Configure AI Gateway
    useAIGateway({ binding: env.AI, gateway: "my-gateway" });

    // Choose your authentication method:
    // Option 1: Use stored keys (recommended for production)
    const auth = ProviderAuth.storedKey();

    // Option 2: Use unified billing
    // const auth = ProviderAuth.unifiedBilling();

    // Option 3: Use a direct token from environment
    // const auth = env.OPENAI_API_KEY;

    // Initialize the OpenAI client
    const openai = new OpenAI({ apiKey: auth });

    // Make your API call
    const response = await openai.chat.completions.create({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: "What is Cloudflare?" }],
    });

    return Response.json(response);
  },
};
```

The same pattern works with the Anthropic SDK:

```ts
import Anthropic from "@anthropic-ai/sdk";
import { useAIGateway, ProviderAuth } from "@cloudflare/ai-gateway";

export default {
  async fetch(request, env) {
    // Configure AI Gateway
    useAIGateway({ binding: env.AI, gateway: "my-gateway" });

    // Choose your authentication method
    const auth = ProviderAuth.storedKey();

    // Initialize the Anthropic client
    const anthropic = new Anthropic({ apiKey: auth });

    // Make your API call
    const message = await anthropic.messages.create({
      model: "claude-sonnet-4-5",
      max_tokens: 1024,
      messages: [{ role: "user", content: "What is Cloudflare?" }],
    });

    return Response.json(message);
  },
};
```

To use the AI Gateway SDK in your Worker, you'll need to configure the AI binding in your wrangler.toml:
```toml
name = "my-worker"
main = "src/index.ts"
compatibility_date = "2024-01-01"

[ai]
binding = "AI"
```

For authentication using environment variables, add them to your wrangler.toml or set them using the Cloudflare dashboard:

```toml
[vars]
OPENAI_API_KEY = "sk-..."
```

Or use secrets for sensitive values:

```sh
wrangler secret put OPENAI_API_KEY
```

The SDK works seamlessly with multiple AI providers. Simply initialize each provider's SDK with the appropriate authentication:
```ts
import OpenAI from "openai";
import Anthropic from "@anthropic-ai/sdk";
import { useAIGateway, ProviderAuth } from "@cloudflare/ai-gateway";

export default {
  async fetch(request, env) {
    useAIGateway({ binding: env.AI, gateway: "my-gateway" });

    const auth = ProviderAuth.storedKey();

    const openai = new OpenAI({ apiKey: auth });
    const anthropic = new Anthropic({ apiKey: auth });

    // Use either provider based on your needs
    const openaiResponse = await openai.chat.completions.create({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: "Hello from OpenAI" }],
    });

    const anthropicResponse = await anthropic.messages.create({
      model: "claude-sonnet-4-5",
      max_tokens: 1024,
      messages: [{ role: "user", content: "Hello from Anthropic" }],
    });

    return Response.json({ openai: openaiResponse, anthropic: anthropicResponse });
  },
};
```

Using the SDK offers several benefits:

- Simplified Configuration: No need to manually construct AI Gateway URLs
- Automatic Routing: Requests are automatically routed through your configured gateway
- Built-in Authentication: Easy access to stored keys and unified billing
- Type Safety: TypeScript definitions for better development experience
- Consistent API: Works with any AI provider SDK that you choose
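The Type Safety point above depends on your Worker's `Env` type matching the bindings you configured in wrangler.toml. A minimal sketch of that typing, plus a helper mirroring "Option 3" (direct token) from the examples above; the `Env` shape and the `directKeyOrNull` helper are illustrative assumptions, not part of the SDK:

```typescript
// Hypothetical Env shape matching the wrangler.toml shown earlier.
// The AI binding type comes from your Workers runtime types, not the SDK;
// it is left as `unknown` here to keep the sketch self-contained.
interface Env {
  AI: unknown;             // the [ai] binding named "AI"
  OPENAI_API_KEY?: string; // set via [vars] or `wrangler secret put`
}

// Prefer a direct API key from the environment when one is present.
// Returning null signals that you should fall back to stored keys or
// unified billing instead of passing an empty token to a provider SDK.
function directKeyOrNull(env: Env): string | null {
  return env.OPENAI_API_KEY && env.OPENAI_API_KEY.length > 0
    ? env.OPENAI_API_KEY
    : null;
}
```

Typing `Env` this way lets TypeScript catch a missing or misnamed binding at compile time rather than at request time.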
Next steps:

- Explore AI Gateway features like caching and rate limiting
- Learn about provider-specific endpoints
- Set up authentication for your gateway
- Configure dynamic routing for fallbacks and A/B testing