Get Started
This guide walks you through setting up and deploying your first Workers AI project with embedded function calling. You will use Workers, a Workers AI binding, the ai-utils package ↗, and a large language model (LLM) to deploy your first AI-powered application on the Cloudflare global network with embedded function calling.
Follow the Workers AI Get Started Guide until step 2.
Next, run the following command in your project repository to install the Workers AI utilities package:
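A sketch of the install command, using the package name from the import shown later in this guide (`@cloudflare/ai-utils`):

```sh
npm install @cloudflare/ai-utils
```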
Update the index.ts file in your application directory with the following code:
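A minimal sketch of such a Worker, assuming an AI binding named `AI` in your Wrangler configuration and a model that supports function calling (the Hermes 2 Pro model ID and the example prompt below are illustrative; adjust both to your setup):

```ts
import { runWithTools } from "@cloudflare/ai-utils";

type Env = {
  AI: Ai;
};

export default {
  async fetch(request, env, ctx): Promise<Response> {
    // Define the function the LLM may call
    const sum = (args: { a: number; b: number }): Promise<string> => {
      const { a, b } = args;
      return Promise.resolve((a + b).toString());
    };

    // Run AI inference with embedded function calling
    const response = await runWithTools(
      env.AI,
      // A model with function calling support
      "@hf/nousresearch/hermes-2-pro-mistral-7b",
      {
        messages: [
          {
            role: "user",
            content: "What is the result of 123123123 + 10343030?",
          },
        ],
        // Describe the available tools so the LLM can select one
        tools: [
          {
            name: "sum",
            description: "Sum up two numbers and return the result",
            parameters: {
              type: "object",
              properties: {
                a: { type: "number", description: "the first number" },
                b: { type: "number", description: "the second number" },
              },
              required: ["a", "b"],
            },
            // Reference to the function defined above
            function: sum,
          },
        ],
      },
    );
    return new Response(JSON.stringify(response));
  },
} satisfies ExportedHandler<Env>;
```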
This example imports the utilities with import { runWithTools } from "@cloudflare/ai-utils" and follows the API reference below.
In this example, we also define and describe a list of tools that the LLM can leverage to respond to the user query. Here, the list contains only one tool, the sum function.
Abstracted by the runWithTools function, the following steps occur:

```mermaid
sequenceDiagram
    participant Worker as Worker
    participant WorkersAI as Workers AI
    Worker->>+WorkersAI: Send messages, function calling prompt, and available tools
    WorkersAI->>+Worker: Select tools and arguments for function calling
    Worker-->>-Worker: Execute function
    Worker-->>+WorkersAI: Send messages, function calling prompt and function result
    WorkersAI-->>-Worker: Send response incorporating function output
```
The ai-utils package is also open-sourced on GitHub ↗.
Follow steps 4 and 5 of the Workers AI Get Started Guide for local development and deployment.
For more details, refer to the API reference.