Add human feedback using Worker Bindings
This guide explains how to provide human feedback for AI Gateway evaluations using Worker bindings.
Start by sending a prompt to the AI model through your AI Gateway.
```js
const resp = await env.AI.run(
  "@cf/meta/llama-3.1-8b-instruct",
  {
    prompt: "tell me a joke",
  },
  {
    gateway: {
      id: "my-gateway",
    },
  },
);

const myLogId = env.AI.aiGatewayLogId;
```
Let the user interact with or evaluate the AI response. This interaction will inform the feedback you send back to the AI Gateway.
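How you collect this signal is up to your application. As a minimal sketch, assume a hypothetical `/feedback` route that the client calls with the log ID and a thumbs up/down reaction; the Worker maps the reaction onto the gateway's feedback values (the `patchLog()` call itself is covered in the next step):

```js
export default {
  async fetch(request, env) {
    const url = new URL(request.url);

    // Hypothetical endpoint: the client posts the user's reaction
    // together with the log ID captured from env.AI.aiGatewayLogId.
    if (request.method === "POST" && url.pathname === "/feedback") {
      const { logId, thumbs } = await request.json();

      // Map the UI reaction onto the gateway's values:
      // 1 = positive, -1 = negative, 0 = not evaluated.
      const feedback = thumbs === "up" ? 1 : thumbs === "down" ? -1 : 0;

      await env.AI.gateway("my-gateway").patchLog(logId, { feedback });
      return new Response(null, { status: 204 });
    }

    return new Response("Not found", { status: 404 });
  },
};
```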
Use the `patchLog()` method to provide feedback for the AI evaluation.
```js
await env.AI.gateway("my-gateway").patchLog(myLogId, {
  feedback: 1, // all fields are optional; set values that fit your use case
  score: 100,
  metadata: {
    user: "123", // optional metadata to provide additional context
  },
});
```
- `feedback`: either `-1` for negative or `1` for positive; `0` is considered not evaluated.
- `score`: a number between 0 and 100.
- `metadata`: an object containing additional contextual information.
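These fields can also be derived from richer signals. For example, a sketch (assuming a hypothetical 1–5 star rating collected from your UI) that maps the rating onto the expected ranges:

```js
// Hypothetical helper: convert a 1-5 star rating into patchLog() fields.
function ratingToFeedback(stars) {
  return {
    // 3 stars and above counts as positive, below as negative.
    feedback: stars >= 3 ? 1 : -1,
    // Scale 1-5 linearly onto 0-100.
    score: Math.round(((stars - 1) / 4) * 100),
    metadata: { stars },
  };
}

await env.AI.gateway("my-gateway").patchLog(myLogId, ratingToFeedback(4));
// -> { feedback: 1, score: 75, metadata: { stars: 4 } }
```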
The `patchLog()` method allows you to send feedback, score, and metadata for a specific log ID. All object properties are optional, so you can include any combination of the parameters:
```js
gateway.patchLog("my-log-id", {
  feedback: 1,
  score: 100,
  metadata: {
    user: "123",
  },
});
```
Returns: `Promise<void>` (make sure to `await` the request).
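Since the method resolves to `Promise<void>`, a dropped promise means the feedback may silently never be recorded. If you want to respond to the client without blocking on the call, one option in a standard Worker module handler is `ctx.waitUntil()`, which keeps the Worker alive until the promise settles; a sketch:

```js
export default {
  async fetch(request, env, ctx) {
    const answer = await env.AI.run(
      "@cf/meta/llama-3.1-8b-instruct",
      { prompt: "tell me a joke" },
      { gateway: { id: "my-gateway" } },
    );
    const myLogId = env.AI.aiGatewayLogId;

    // waitUntil keeps the Worker alive until patchLog() settles,
    // so the feedback is recorded even after the response is returned.
    ctx.waitUntil(
      env.AI.gateway("my-gateway").patchLog(myLogId, { feedback: 1 }),
    );

    return Response.json(answer);
  },
};
```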