
Agent Operations

Agent | Define Prompt Template

The Agent define prompt template operation lets you apply a specific prompt template to your LLM. It allows you to define and compose AI functions in plain text, so you can create natural language prompts, generate responses, extract information, invoke other prompts, or perform any other text-based task.

Agent Prompt Template

Input Configuration

Module Configuration

This refers to the MAC Inference LLM Configuration set up in the Getting Started section.
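
If you have not created this configuration yet, the global element referenced by config-ref (Github_Models in the XML example below) looks broadly like the following sketch. The element and attribute names here are illustrative placeholders only; refer to the Getting Started section for the exact configuration schema of your connector version.

<!-- Illustrative placeholder only: element and attribute names may differ in your connector version. -->
<mac-inference:llm-config name="Github_Models"
    llmType="GITHUB_MODELS"
    apiKey="${github.models.api.key}"
    modelName="gpt-4o-mini"
    maxTokens="500"
    temperature="0.7"/>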

General Operation Fields

  • Template: Contains the prompt template for the operation.
  • Instructions: Provides instructions for the LLM, outlining the goals of the task.
  • Dataset: Specifies the dataset to be evaluated by the LLM using the provided template and instructions.

XML Configuration

Below is the XML configuration for this operation:

<mac-inference:agent-define-prompt-template doc:name="Agent define prompt template" doc:id="5944353c-c784-4268-9f16-c036e5eaf8e3" config-ref="Github_Models">
    <mac-inference:template><![CDATA[#[payload.template]]]></mac-inference:template>
    <mac-inference:instructions><![CDATA[#[payload.instructions]]]></mac-inference:instructions>
    <mac-inference:data><![CDATA[#[payload.dataset]]]></mac-inference:data>
</mac-inference:agent-define-prompt-template>
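
The DataWeave expressions above read the template, instructions, and dataset from the incoming payload. Purely as an illustration, an input payload that these expressions could resolve against might look like this:

{
  "template": "You are a customer satisfaction agent. Classify the sentiment of the dataset and write a short reply.",
  "instructions": "Return a JSON object with the fields type and response.",
  "dataset": "The training last week was fantastic, thank you!"
}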
 

Output Configuration

Response Payload

This operation returns a JSON payload containing the main LLM response. Additional information, such as token usage, is returned as metadata in the message attributes rather than in the main payload.

Example Response Payload

{
  "response": "{\n  \"type\": \"positive\",\n  \"response\": \"Thank you for your positive feedback on the training last week. We are glad to hear that you had a great experience. Have a nice day!\"\n}"
}
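
Note that the response field is itself an escaped JSON string. If you need it as structured JSON downstream, one option is to parse it with DataWeave's read function, for example in a Set Payload component (a minimal sketch, assuming the LLM returned valid JSON):

<set-payload doc:name="Parse LLM response"
    value="#[read(payload.response, 'application/json')]"
    mimeType="application/json" />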

Attributes

Along with the JSON payload, the operation also returns attributes, which include information about token usage:

{
  "tokenUsage": {
    "outputCount": 9,
    "totalCount": 18,
    "inputCount": 9
  },
  "additionalAttributes": {}
}
  • tokenUsage: The token usage metadata returned as attributes.
    • outputCount: The number of tokens used to generate the output.
    • totalCount: The total number of tokens used for input and output.
    • inputCount: The number of tokens used to process the input.
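
These values can be read from the message attributes in a flow, for example to log token consumption right after the operation (a minimal sketch, assuming tokenUsage is navigable on the attributes as shown above):

<logger level="INFO" doc:name="Log token usage"
    message='#["Tokens used: $(attributes.tokenUsage.totalCount) (input $(attributes.tokenUsage.inputCount), output $(attributes.tokenUsage.outputCount))"]' />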

Example Use Cases

Prompt templates can be applied in various scenarios, such as:

  • Customer Service Agents: Enhance customer service by generating case summaries, classifying cases, summarizing large datasets, and more.
  • Sales Operation Agents: Assist sales teams in writing sales emails, summarizing cases for specific accounts, assessing the probability of closing deals, and more.
  • Marketing Agents: Support marketing teams in generating product descriptions, creating newsletters, planning social media campaigns, and more.