AI SDK

Workers AI can be used with the AI SDK for JavaScript and TypeScript codebases.

Setup

Install the workers-ai-provider package:

npm install workers-ai-provider

Then, add an AI binding to your Worker project's wrangler.toml file:

[ai]
binding = "AI"
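
With this binding in place, the Workers runtime exposes it as env.AI inside your Worker. A minimal sketch of typing and checking the binding (the Env type name is illustrative; the Ai type comes from the generated Workers runtime types):

// Minimal sketch: the [ai] binding surfaces as env.AI in your Worker.
type Env = {
  AI: Ai;
};

export default {
  async fetch(_request: Request, env: Env) {
    // env.AI is the binding passed to createWorkersAI() in the examples below.
    return new Response(env.AI ? 'AI binding is available' : 'AI binding is missing');
  },
};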

Models

The AI SDK can be configured to work with any Workers AI model. Create a provider instance from your AI binding, then pass it the ID of the model you want to use.

import { createWorkersAI } from 'workers-ai-provider';
const workersai = createWorkersAI({ binding: env.AI });
// Choose any model: https://developers.cloudflare.com/workers-ai/models/
const model = workersai('@cf/meta/llama-3.1-8b-instruct', {});

Generate Text

Once you have selected your model, you can generate text from a given prompt.

import { createWorkersAI } from 'workers-ai-provider';
import { generateText } from 'ai';

type Env = {
  AI: Ai;
};

export default {
  async fetch(_: Request, env: Env) {
    const workersai = createWorkersAI({ binding: env.AI });

    const result = await generateText({
      model: workersai('@cf/meta/llama-2-7b-chat-int8'),
      prompt: 'Write a 50-word essay about hello world.',
    });

    return new Response(result.text);
  },
};

Stream Text

For longer responses, consider streaming the output so it reaches the client as the generation completes.

import { createWorkersAI } from 'workers-ai-provider';
import { streamText } from 'ai';

type Env = {
  AI: Ai;
};

export default {
  async fetch(_: Request, env: Env) {
    const workersai = createWorkersAI({ binding: env.AI });

    const result = streamText({
      model: workersai('@cf/meta/llama-2-7b-chat-int8'),
      prompt: 'Write a 50-word essay about hello world.',
    });

    return result.toTextStreamResponse({
      headers: {
        // add these headers to ensure that the
        // response is chunked and streamed
        'Content-Type': 'text/x-unknown',
        'content-encoding': 'identity',
        'transfer-encoding': 'chunked',
      },
    });
  },
};
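
On the client, the streamed body can be consumed incrementally with the standard Fetch API. A minimal sketch, where the URL is a placeholder for your deployed Worker:

// Placeholder URL; replace with your deployed Worker.
const response = await fetch('https://your-worker.example.workers.dev');
const reader = response.body!.getReader();
const decoder = new TextDecoder();

while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  // Each chunk is logged as the model generates it.
  console.log(decoder.decode(value, { stream: true }));
}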

Generate Structured Objects

You can provide a Zod schema to generate a structured JSON response.

import { createWorkersAI } from 'workers-ai-provider';
import { generateObject } from 'ai';
import { z } from 'zod';

type Env = {
  AI: Ai;
};

export default {
  async fetch(_: Request, env: Env) {
    const workersai = createWorkersAI({ binding: env.AI });

    const result = await generateObject({
      model: workersai('@cf/meta/llama-3.1-8b-instruct'),
      prompt: 'Generate a Lasagna recipe',
      schema: z.object({
        recipe: z.object({
          ingredients: z.array(z.string()),
          description: z.string(),
        }),
      }),
    });

    return Response.json(result.object);
  },
};