API Reference
Learn more about the API reference for embedded function calling.
This wrapper method enables you to do embedded function calling. You pass it the AI binding, model, inputs (a `messages` array and a `tools` array), and optional configurations.
- AI Binding Ai - The AI binding, such as `env.AI`.
- `model` BaseAiTextGenerationModels - The ID of the model that supports function calling. For example, `@hf/nousresearch/hermes-2-pro-mistral-7b`.
- `input` Object
  - `messages` RoleScopedChatInput[]
  - `tools` AiTextGenerationToolInputWithFunction[]
- `config` Object
  - `streamFinalResponse` boolean optional
  - `maxRecursiveToolRuns` number optional
  - `strictValidation` boolean optional
  - `verbose` boolean optional
  - `trimFunction` boolean optional - For `trimFunction`, you can pass it `autoTrimTools`, which is another helper method we've devised to automatically choose the correct tools (using an LLM) before sending it off for inference. This means that your final inference call will have fewer input tokens.
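Here is a minimal usage sketch of the wrapper inside a Worker. The section above does not name the wrapper, so this example assumes it is `runWithTools`, exported alongside `autoTrimTools` from the `@cloudflare/ai-utils` package; the weather tool and its logic are placeholders.

```ts
import { runWithTools, autoTrimTools } from "@cloudflare/ai-utils";

export default {
  // `Ai` is the binding type provided by @cloudflare/workers-types.
  async fetch(request: Request, env: { AI: Ai }): Promise<Response> {
    const response = await runWithTools(
      env.AI, // the AI binding
      "@hf/nousresearch/hermes-2-pro-mistral-7b", // a model that supports function calling
      {
        messages: [{ role: "user", content: "What is the weather in Austin?" }],
        tools: [
          {
            name: "getWeather",
            description: "Return the current weather for a city",
            parameters: {
              type: "object",
              properties: { city: { type: "string", description: "City name" } },
              required: ["city"],
            },
            // Invoked with the arguments the model produces; its return value
            // is fed back to the model for the final response.
            function: async ({ city }: { city: string }) => `Sunny in ${city}`,
          },
        ],
      },
      {
        // Optional configuration described above.
        streamFinalResponse: false,
        maxRecursiveToolRuns: 1,
        verbose: true,
        trimFunction: autoTrimTools, // have an LLM pick the relevant tools first
      }
    );

    return Response.json(response);
  },
};
```

Passing `autoTrimTools` as the `trimFunction` means only the tools the model judges relevant are included in the final inference call, which keeps the input token count down.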
This method lets you automatically create tool schemas based on OpenAPI specs, so you don't have to manually write or hardcode the tool schemas. You can pass the OpenAPI spec for any API in JSON or YAML format. `createToolsFromOpenAPISpec` has a `config` input that allows you to perform overrides if you need to provide headers like Authentication or User-Agent.
- `spec` string - The OpenAPI specification in either JSON or YAML format, or a URL to a remote OpenAPI specification.
- `config` Config optional - Configuration options for the `createToolsFromOpenAPISpec` function.
  - `overrides` ConfigRule[] optional
  - `matchPatterns` RegExp[] optional
  - `options` Object optional
    - `verbose` boolean optional
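As a sketch of how the two helpers compose, the example below generates tools from a remote spec and passes them straight into the wrapper method above. The spec URL, the `EXAMPLE_API_TOKEN` secret, and the exact `ConfigRule` shape (a `matchPatterns` array plus the header values to apply) are assumptions for illustration; check the package's exported types for the precise fields.

```ts
import { createToolsFromOpenAPISpec, runWithTools } from "@cloudflare/ai-utils";

export default {
  async fetch(
    request: Request,
    env: { AI: Ai; EXAMPLE_API_TOKEN: string }
  ): Promise<Response> {
    // Generate tool schemas from a remote OpenAPI spec (a JSON/YAML string also works).
    const tools = await createToolsFromOpenAPISpec(
      "https://api.example.com/openapi.json", // hypothetical spec URL
      {
        overrides: [
          {
            // Apply these headers to any generated request that matches the pattern.
            matchPatterns: [/^https:\/\/api\.example\.com\/.*$/],
            values: {
              headers: {
                Authorization: `Bearer ${env.EXAMPLE_API_TOKEN}`, // hypothetical secret binding
                "User-Agent": "my-worker/1.0",
              },
            },
          },
        ],
        options: { verbose: true },
      }
    );

    // The generated schemas slot into the wrapper's tools array.
    const response = await runWithTools(
      env.AI,
      "@hf/nousresearch/hermes-2-pro-mistral-7b",
      {
        messages: [{ role: "user", content: "Call the example API for me." }],
        tools,
      }
    );

    return Response.json(response);
  },
};
```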