JSON Mode

When you want text-generation AI models to interact with databases, services, and external systems programmatically, typically for tool calling or when building AI agents, you need structured responses rather than natural language.

Workers AI supports JSON Mode, enabling applications to request a structured output response when interacting with AI models.

Schema

JSON Mode is compatible with OpenAI's implementation; to enable it, add the response_format property to the request object using the following convention:

{
  response_format: {
    title: "JSON Mode",
    type: "object",
    properties: {
      type: {
        type: "string",
        enum: ["json_object", "json_schema"],
      },
      json_schema: {},
    }
  }
}

Where json_schema must be a valid JSON Schema declaration.
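For reference, the two accepted shapes of response_format can be sketched as a TypeScript union (the type name below is illustrative only and not part of the Workers AI API):

type ResponseFormat =
  // "json_object" asks the model for any well-formed JSON object
  | { type: "json_object" }
  // "json_schema" additionally constrains the output to the supplied JSON Schema
  | { type: "json_schema"; json_schema: Record<string, unknown> };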

JSON Mode example

When using JSON Mode, pass the schema as part of the request you send to the LLM, as in the example below.

{
  "messages": [
    {
      "role": "system",
      "content": "Extract data about a country."
    },
    {
      "role": "user",
      "content": "Tell me about India."
    }
  ],
  "response_format": {
    "type": "json_schema",
    "json_schema": {
      "type": "object",
      "properties": {
        "name": {
          "type": "string"
        },
        "capital": {
          "type": "string"
        },
        "languages": {
          "type": "array",
          "items": {
            "type": "string"
          }
        }
      },
      "required": [
        "name",
        "capital",
        "languages"
      ]
    }
  }
}

The LLM will follow the schema and return a response such as the one below:

{
  "response": {
    "name": "India",
    "capital": "New Delhi",
    "languages": [
      "Hindi",
      "English",
      "Bengali",
      "Telugu",
      "Marathi",
      "Tamil",
      "Gujarati",
      "Urdu",
      "Kannada",
      "Odia",
      "Malayalam",
      "Punjabi",
      "Sanskrit"
    ]
  }
}

As you can see, the model complies with the JSON Schema definition in the request and responds with a valid JSON object.
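The same request can be issued from a Worker through the AI binding. The following is a minimal sketch: the binding name (AI) and the model (@cf/meta/llama-3.1-8b-instruct) are assumptions, so substitute the binding configured in your Wrangler configuration and any model from the supported list below.

// Minimal Worker sketch; types such as Ai come from @cloudflare/workers-types.
// Binding name and model are assumptions, not requirements of JSON Mode.
export interface Env {
  AI: Ai;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const result = await env.AI.run("@cf/meta/llama-3.1-8b-instruct", {
      messages: [
        { role: "system", content: "Extract data about a country." },
        { role: "user", content: "Tell me about India." },
      ],
      response_format: {
        type: "json_schema",
        json_schema: {
          type: "object",
          properties: {
            name: { type: "string" },
            capital: { type: "string" },
            languages: { type: "array", items: { type: "string" } },
          },
          required: ["name", "capital", "languages"],
        },
      },
    });

    // result contains the schema-conforming object shown above.
    return Response.json(result);
  },
};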

Supported Models

This is the list of models that now support JSON Mode:

We will continue extending this list to keep up with new and requested models.

Note that Workers AI can't guarantee that the model responds according to the requested JSON Schema. Depending on the complexity of the task and the adequacy of the JSON Schema, the model may not be able to satisfy the request in extreme situations. In that case, the error "JSON Mode couldn't be met" is returned and must be handled.
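A sketch of handling that failure is shown below. The model name and Env interface carry over from the previous example, and the choice made in the catch block (log and return null) is illustrative rather than prescribed by Workers AI.

// Hedged sketch: wraps the JSON Mode call so callers can react to a failed request.
async function extractCountry(env: Env, country: string): Promise<unknown | null> {
  try {
    return await env.AI.run("@cf/meta/llama-3.1-8b-instruct", {
      messages: [
        { role: "system", content: "Extract data about a country." },
        { role: "user", content: `Tell me about ${country}.` },
      ],
      response_format: {
        type: "json_schema",
        json_schema: {
          type: "object",
          properties: {
            name: { type: "string" },
            capital: { type: "string" },
          },
          required: ["name", "capital"],
        },
      },
    });
  } catch (err) {
    // The model could not satisfy the schema (or another error occurred);
    // retry with a simpler schema, fall back to plain text, or surface the failure.
    console.error("JSON Mode request failed:", err);
    return null;
  }
}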

JSON Mode currently doesn't support streaming.