Using AI Gateway in Workers

The @cloudflare/ai-gateway SDK provides a convenient way to use AI Gateway directly from Cloudflare Workers. This package simplifies authentication and routing by automatically handling gateway configuration and provider credentials.

Installation

Install the package using npm:

Terminal window
npm install @cloudflare/ai-gateway

Basic Usage

The SDK provides two main utilities:

useAIGateway

The useAIGateway function configures your Worker to use AI Gateway. It automatically handles routing requests through your specified gateway.

TypeScript
import { useAIGateway } from "@cloudflare/ai-gateway";

export default {
  async fetch(request, env) {
    useAIGateway({ binding: env.AI, gateway: "your-gateway-name" });
    // Your AI provider SDK calls will now be routed through AI Gateway
  },
};

Parameters

  • binding: The AI binding from your Worker environment (typically env.AI)
  • gateway: The name of your AI Gateway
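In TypeScript, the binding passed as env.AI comes from your Worker's Env type. The following is a minimal sketch, assuming @cloudflare/workers-types is installed (for the Ai type) and using the illustrative binding and variable names from this page; adjust them to match your own Wrangler configuration.

TypeScript
import { useAIGateway } from "@cloudflare/ai-gateway";

// Illustrative Env shape; names must match your Wrangler config.
interface Env {
  AI: Ai;                  // the [ai] binding (see Environment Setup below)
  OPENAI_API_KEY?: string; // only needed for direct-token authentication
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    useAIGateway({ binding: env.AI, gateway: "your-gateway-name" });
    // ...provider SDK calls go here
    return new Response("OK");
  },
};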

ProviderAuth

The ProviderAuth class provides methods for authenticating with AI providers. It supports three authentication modes:

1. Stored Keys (BYOK - Bring Your Own Keys)

Use credentials stored in Cloudflare's AI Gateway. This is the recommended approach for production use.

TypeScript
import { ProviderAuth } from "@cloudflare/ai-gateway";
const auth = ProviderAuth.storedKey();

Learn more about storing keys in AI Gateway.

2. Unified Billing

Use Cloudflare's AI Gateway billing to pay for and authenticate your inference requests.

TypeScript
import { ProviderAuth } from "@cloudflare/ai-gateway";
const auth = ProviderAuth.unifiedBilling();

Learn more about Unified Billing.

3. Direct Authorization Token

Pass your provider's API key directly from environment variables. This is useful for development and testing.

TypeScript
const auth = env.OPENAI_API_KEY; // or any other provider's API key

Complete Example

Here's a complete example using the OpenAI SDK with AI Gateway in a Worker:

TypeScript
import OpenAI from "openai";
import { useAIGateway, ProviderAuth } from "@cloudflare/ai-gateway";

export default {
  async fetch(request, env) {
    // Configure AI Gateway
    useAIGateway({ binding: env.AI, gateway: "my-gateway" });

    // Choose your authentication method:
    // Option 1: Use stored keys (recommended for production)
    const auth = ProviderAuth.storedKey();
    // Option 2: Use unified billing
    // const auth = ProviderAuth.unifiedBilling();
    // Option 3: Use direct token from environment
    // const auth = env.OPENAI_API_KEY;

    // Initialize the OpenAI client
    const openai = new OpenAI({
      apiKey: auth,
    });

    // Make your API call
    const response = await openai.chat.completions.create({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: "What is Cloudflare?" }],
    });

    return Response.json(response);
  },
};

Environment Setup

To use the AI Gateway SDK in your Worker, you'll need to configure the AI binding in your wrangler.toml:

name = "my-worker"
main = "src/index.ts"
compatibility_date = "2024-01-01"
[ai]
binding = "AI"

For authentication using environment variables, add them to your wrangler.toml or set them using the Cloudflare dashboard:

[vars]
OPENAI_API_KEY = "sk-..."

Or use secrets for sensitive values:

Terminal window
wrangler secret put OPENAI_API_KEY
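At runtime, a secret set this way is read from env exactly like a plain variable, so the direct-token option shown earlier works unchanged. A minimal sketch, assuming the OPENAI_API_KEY name used above:

TypeScript
// Inside your fetch handler: secrets and [vars] entries are both exposed on `env`.
const auth = env.OPENAI_API_KEY; // set with `wrangler secret put OPENAI_API_KEY`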

Using with Multiple Providers

The SDK works with multiple AI providers in the same Worker. Initialize each provider's SDK with the appropriate authentication:

TypeScript
import OpenAI from "openai";
import Anthropic from "@anthropic-ai/sdk";
import { useAIGateway, ProviderAuth } from "@cloudflare/ai-gateway";

export default {
  async fetch(request, env) {
    useAIGateway({ binding: env.AI, gateway: "my-gateway" });
    const auth = ProviderAuth.storedKey();

    const openai = new OpenAI({ apiKey: auth });
    const anthropic = new Anthropic({ apiKey: auth });

    // Use either provider based on your needs
    const openaiResponse = await openai.chat.completions.create({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: "Hello from OpenAI" }],
    });

    const anthropicResponse = await anthropic.messages.create({
      model: "claude-sonnet-4-5",
      max_tokens: 1024,
      messages: [{ role: "user", content: "Hello from Anthropic" }],
    });

    return Response.json({ openai: openaiResponse, anthropic: anthropicResponse });
  },
};

Benefits of Using the SDK

  1. Simplified Configuration: No need to manually construct AI Gateway URLs (see the comparison sketch after this list)
  2. Automatic Routing: Requests are automatically routed through your configured gateway
  3. Built-in Authentication: Easy access to stored keys and unified billing
  4. Type Safety: TypeScript definitions for better development experience
  5. Consistent API: Works with any AI provider SDK that you choose
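To illustrate the first point, here is roughly what the manual setup looks like without the SDK: you construct the gateway endpoint yourself and pass it as the provider's base URL. This is a sketch only; take the exact URL path for your provider, your account ID, and your gateway name from the AI Gateway provider documentation.

TypeScript
import OpenAI from "openai";

export default {
  async fetch(request, env) {
    // Manual alternative (no @cloudflare/ai-gateway): point the provider SDK
    // at your gateway's OpenAI endpoint yourself. Replace the placeholders
    // with your own account ID and gateway name.
    const openai = new OpenAI({
      apiKey: env.OPENAI_API_KEY,
      baseURL:
        "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/openai",
    });

    const response = await openai.chat.completions.create({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: "What is Cloudflare?" }],
    });

    return Response.json(response);
  },
};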

Next Steps