
MAC Inference Connector Overview

The MAC Inference Connector provides access to inference offerings for large language models (LLMs), e.g. Groq, Hugging Face, and GitHub Models.

What is the MAC Inference Connector?

The MAC Inference Connector is designed to help developers easily build and manage AI-driven agents within the MuleSoft Anypoint Platform. It provides operations to interface directly with the API of the Inference Providers.
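As a rough sketch of what this looks like in a Mule application, the flow below forwards an incoming question to a configured inference provider through a chat operation. The namespace prefix, element, and attribute names here are illustrative assumptions, not the connector's exact schema:

```xml
<!-- Hypothetical sketch: "ms-inference" prefix, operation, and
     attribute names are assumptions for illustration only. -->
<flow name="chat-answer-flow">
    <!-- Expose an HTTP endpoint that accepts a question -->
    <http:listener path="/chat" config-ref="HTTP_Listener_config"/>

    <!-- Send the question to the configured inference provider -->
    <ms-inference:chat-answer-prompt
        config-ref="Inference_config"
        prompt="#[payload.question]"/>
</flow>
```

The flow returns the provider's response as the payload, which can then be transformed or routed like any other Mule message.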


Key Features

The MAC Inference Connector simplifies AI integration into MuleSoft applications with:

  • Seamless Interaction with LLMs: Integrate large language models (LLMs) for text generation, analysis, and other complex natural language tasks.
  • Embeddings and Search: Combine with MAC Vectors to handle tasks like text similarity, document search, and clustering, all within MuleSoft applications.
  • Optimized Performance: Designed for high efficiency and performance in enterprise-grade MuleSoft applications, ensuring smooth handling of AI operations.
  • Comprehensive AI Tools and Services: Access a wide array of AI-driven features, including Retrieval-Augmented Generation (RAG) for document retrieval and dynamic tool integration (Function Calling) for invoking external tools and services from a prompt.

Supported Inference Providers

The MAC Inference Connector supports the following inference offerings:

  • GitHub Models
  • Groq AI
  • Hugging Face
  • Ollama
  • OpenRouter
  • Portkey
  • Cerebras

Supported Operations by Inference Offering

Not all operations are supported by every inference provider, and support for some operations depends on the model used. The available operations are:

  • Chat answer prompt
  • Chat completion
  • Agent define prompt template
  • Tools native template
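Because the operations share a common configuration, switching providers is, in outline, a matter of changing the connector's global configuration element. The snippet below is a hypothetical sketch; the element and parameter names are assumptions and may differ from the connector's actual schema:

```xml
<!-- Hypothetical sketch: element and attribute names are
     assumptions; the provider type, model name, and API key
     would be set per your chosen inference offering. -->
<ms-inference:config name="Inference_config"
    inferenceType="GROQ"
    modelName="llama-3.1-8b-instant"
    apiKey="${groq.apiKey}"/>
```

Keeping credentials in a property placeholder such as `${groq.apiKey}` (rather than hard-coding them) follows the usual Mule practice of externalizing secrets per environment.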

Additional Integrations

The MAC Inference Connector integrates seamlessly with the other MAC project AI connectors and the broader MuleSoft ecosystem, offering enhanced functionality.