Function calling
Beta

Function calling enables you to use a Large Language Model's (LLM's) response to execute functions or interact with external APIs. The developer typically defines a set of functions along with the required input schema for each function; together, these are called tools. The model then decides when a tool call is needed and returns a JSON object describing the call, which you then feed to the corresponding function or API.
In essence, function calling allows you to perform actions with LLMs by executing code or making additional API calls.
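To make the flow concrete, here is a minimal sketch of the pieces involved. The tool name `getWeather`, its schema, and the `dispatch` helper are all illustrative, not part of any specific API: the model only emits a JSON description of the call, and your code is responsible for routing it to a real function.

```typescript
// A hypothetical tool definition: a name, a description, and a JSON
// schema describing the arguments the model should produce.
const tools = [
  {
    name: "getWeather",
    description: "Return the current weather for a given city",
    parameters: {
      type: "object",
      properties: {
        city: { type: "string", description: "City name, e.g. London" },
      },
      required: ["city"],
    },
  },
];

// The model does not run code itself; it returns JSON in roughly this
// shape, which your application must route to the matching function.
type ToolCall = { name: string; arguments: Record<string, unknown> };

const handlers: Record<string, (args: any) => string> = {
  // Stubbed implementation for illustration only.
  getWeather: (args: { city: string }) => `Sunny in ${args.city}`,
};

function dispatch(call: ToolCall): string {
  const handler = handlers[call.name];
  if (!handler) throw new Error(`Unknown tool: ${call.name}`);
  return handler(call.arguments);
}

// Example: the model asked for getWeather with { city: "London" }.
const result = dispatch({ name: "getWeather", arguments: { city: "London" } });
```

The key point is the division of labor: the model chooses *which* tool to call and *with what* arguments, while your code performs the actual execution.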
Workers AI has embedded function calling, which allows you to execute function code alongside your inference calls. To facilitate this, we provide a package called @cloudflare/ai-utils ↗, which we have open-sourced on GitHub ↗.
For industry-standard function calling, take a look at the documentation on Traditional Function Calling.
To see the value of embedded function calling, take a look at the example below, which compares traditional function calling with embedded function calling. Embedded function calling allowed us to cut the code from 77 lines down to 31.
There are open-source models that have been fine-tuned for function calling. When browsing our model catalog, look for models with the function calling property beside them. For example, @hf/nousresearch/hermes-2-pro-mistral-7b is a fine-tuned variant of Mistral 7B that you can use for function calling.
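As a rough sketch of using such a model in the traditional (non-embedded) style, you pass the tool schema alongside the messages and handle the returned tool call yourself. The `env.AI` binding name and the `tool_calls` field follow Workers AI conventions, but treat the details here as assumptions and verify them against the Traditional Function Calling documentation.

```typescript
// Request payload for traditional function calling: messages plus tools.
export const payload = {
  messages: [{ role: "user", content: "What is the weather in London?" }],
  tools: [
    {
      name: "getWeather",
      description: "Return the current weather for a given city",
      parameters: {
        type: "object",
        properties: {
          city: { type: "string" },
        },
        required: ["city"],
      },
    },
  ],
};

// Sketch of the call inside a Worker (env.AI is the Workers AI binding).
// The model replies with a tool_calls array rather than executing anything;
// your code must run the named function and continue the conversation.
export async function ask(env: { AI: { run: Function } }) {
  const result = await env.AI.run(
    "@hf/nousresearch/hermes-2-pro-mistral-7b",
    payload,
  );
  return result.tool_calls;
}
```

Contrast this with the embedded approach above: here the model's JSON output comes back to you raw, and executing the function and sending its result back to the model are extra steps you write yourself.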