
Function Calling & Tools

This section provides background on Function Calling & Tools using the MuleSoft AI Chain (MAC) Project.

Types of Solution

The concept of Tools, also known as Function Calling, allows the LLM to call one or more available tools when necessary, usually defined by the developer. Natively, the LLM does not execute tools itself; it produces a tool-execution request describing which tools should be used to answer a user's query.
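The request-then-execute cycle described above can be sketched generically. This is an illustrative, framework-agnostic sketch, not MuleSoft-specific code: the tool registry, the request shape, and the `fake_llm` stand-in are all assumptions for illustration.

```python
# Illustrative function-calling loop (assumptions only, not the MAC Project API).

# Developer-defined tools the model is allowed to request.
TOOLS = {
    "get_order_status": lambda order_id: {"order_id": order_id, "status": "shipped"},
}

def fake_llm(query):
    """Stand-in for a real LLM call: returns a tool-execution request
    when the query cannot be answered directly."""
    if "order" in query:
        return {"tool": "get_order_status", "arguments": {"order_id": "A-123"}}
    return {"answer": query}

def answer(query):
    response = fake_llm(query)
    if "tool" in response:
        # The model asked for a tool; the application executes it and would
        # normally feed the result back to the model for a final answer.
        fn = TOOLS[response["tool"]]
        return fn(**response["arguments"])
    return response["answer"]

print(answer("Where is order A-123?"))  # → {'order_id': 'A-123', 'status': 'shipped'}
```

The key point is the division of labor: the model only decides *which* tool to call and with *what* arguments; the application performs the actual execution.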

In the MAC Project, we define tools as API resources running on MuleSoft Anypoint Runtimes and published on Anypoint Exchange. There are two main operations available:

  • Tools Use AI Service: The Tools | Use AI Service operation is useful if you want to create autonomous agents that can use external tools whenever a prompt cannot be answered directly by the AI model. A prerequisite for using Tools is to prepare a tools.config.json file, which includes all the necessary API information for the operation to execute APIs successfully.
  • Tools Use AI Native: The Tools | Use Native Template operation also targets autonomous agents that can use external tools whenever a prompt cannot be answered directly by the AI model. However, this operation only returns a request to execute the Tools provided in the payload; it doesn't execute them.

Tools Use AI Service

This operation implements tool execution out-of-the-box. You provide the available tools in a tools.config.json file; the operation assesses them and executes the matching API calls based on the user's query and the provided tool implementation details.
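A tools.config.json file lists the API information the operation needs to call each tool. The exact schema is defined by the MAC Project; the field names and values below are illustrative assumptions only, not the authoritative schema:

```json
[
  {
    "action": "Get the shipping status of a customer order",
    "url": "https://api.example.com/orders",
    "method": "GET",
    "headers": { "Authorization": "Bearer <token>" },
    "query": { "orderId": "The order identifier mentioned by the user" }
  }
]
```

Each entry pairs a natural-language description of what the API does (which the LLM uses to decide when the tool is relevant) with the concrete request details the operation needs to execute it.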

Supported by Connectors:

  • MuleSoft AI Chain
  • Einstein AI
  • Amazon Bedrock


Tools Use AI Native

This operation does not execute tools itself. Instead, it returns the LLM's tool-execution request for the Tools provided in the payload, leaving it to your Mule flow to perform the actual execution.
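Because the connector only returns the request, your flow receives something like a function-call description to act on. The exact response shape depends on the connector and model; a generic tool-execution request, in the style popularized by OpenAI-compatible APIs, looks roughly like this (illustrative assumption, not the connector's documented output):

```json
{
  "tool_calls": [
    {
      "name": "get_order_status",
      "arguments": { "orderId": "A-123" }
    }
  ]
}
```

Your flow would route each named tool call to the corresponding API, then optionally pass the results back to the model for a final answer.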

Supported by Connectors:

  • MuleSoft AI Chain
  • MAC Inference