Chat Operations

Chat | Answer Prompt

The Chat answer prompt operation sends a simple prompt request to the configured LLM. It takes a plain text prompt as input and returns the LLM's answer in the response payload.

Chat Answer Prompt

Input Configuration

Module Configuration

This refers to the MuleSoft AI Chain LLM Configuration set up in the Getting Started section.

General Operation Fields

  • Prompt: Contains the prompt as plain text for the operation.

XML Configuration

Below is the XML configuration for this operation:

<ms-aichain:chat-answer-prompt
  doc:name="Chat answer prompt"
  doc:id="8ba9d534-f801-4ac7-8a31-11462fc5204b"
  config-ref="MuleChain_AI_Llm_configuration"
  prompt="#[payload.prompt]"
/>
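
For context, the snippet below sketches how this operation might be wired into a flow behind an HTTP listener so that payload.prompt comes from the incoming JSON request body. The flow name, listener path, and HTTP_Listener_config reference are illustrative assumptions, not part of the connector:

<flow name="chat-answer-prompt-flow">
  <http:listener doc:name="Listener" config-ref="HTTP_Listener_config" path="/chat"/>
  <!-- Incoming JSON body is assumed to look like: {"prompt": "What is the capital of Switzerland?"} -->
  <ms-aichain:chat-answer-prompt
    doc:name="Chat answer prompt"
    config-ref="MuleChain_AI_Llm_configuration"
    prompt="#[payload.prompt]"/>
</flow>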

Output Configuration

Response Payload

This operation responds with a JSON payload containing the main LLM response. Additionally, token usage and other metadata are returned as attributes.

Example Response Payload

{
    "response": "The capital of Switzerland is Bern. It's known for its well-preserved medieval old town, which is a UNESCO World Heritage site. Bern became the capital of Switzerland in 1848. The Swiss parliament, the Federal Assembly, is located in Bern."
}

Attributes

Along with the JSON payload, the operation also returns attributes, including information about token usage:

{
  "tokenUsage": {
      "outputCount": 9,
      "totalCount": 18,
      "inputCount": 9
  },
  "additionalAttributes": {}
}
  • tokenUsage
    • outputCount: The number of tokens used to generate the output.
    • totalCount: The total number of tokens used for both input and output.
    • inputCount: The number of tokens used to process the input.
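
To monitor consumption, you can read these values from the message attributes after the operation runs. The logger below is a minimal sketch, assuming the token usage attributes are accessible through DataWeave as shown:

<logger level="INFO" doc:name="Log token usage"
        message="#['Token usage - input: $(attributes.tokenUsage.inputCount), output: $(attributes.tokenUsage.outputCount), total: $(attributes.tokenUsage.totalCount)']"/>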

Example Use Cases

This operation can be used in the following scenarios:

  • Basic Chatbots: Answer simple user prompts.
  • Customer Service Queries: Provide direct answers to frequently asked questions.

Chat | Answer Prompt with Memory

The Chat answer prompt with memory operation is useful when you want to retain conversation history, for example across multiple users in a chat application.

Chat Answer Prompt with Memory

Input Configuration

Module Configuration

This refers to the MuleSoft AI Chain LLM Configuration set up in the Getting Started section.

General Operation Fields

  • Data: Contains the prompt for the operation.
  • Memory Name: The name of the conversation. For multi-user support, enter the unique user ID.
  • DB File Path: The path to the in-memory database for storing the conversation history. You can also use a DataWeave expression for this field, e.g., #["/Users/john.wick/Desktop/mac-demo/db/" ++ payload.memoryName].
  • Max Messages: The maximum number of messages to remember for the conversation defined in Memory Name.

XML Configuration

Below is the XML configuration for this operation:

<ms-aichain:chat-answer-prompt-with-memory
  doc:name="Chat answer prompt with memory"
  doc:id="7e62e70e-eff7-4080-bb20-3d162bb84c39"
  config-ref="MuleSoft_AI_Chain_Config"
  memoryName="#[payload.memoryName]"
  dbFilePath='#["/Users/john.wick/Desktop/mac-demo/db/" ++ payload.memoryName]'
  maxMessages="#[payload.maxMessages]">
  <ms-aichain:data><![CDATA[#[payload.prompt]]]></ms-aichain:data>
</ms-aichain:chat-answer-prompt-with-memory>
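
As a usage sketch, the flow below keys the conversation memory on a per-user identifier taken from a query parameter, so each user gets an isolated history. The flow name, listener path, userId query parameter, and maxMessages value are assumptions for illustration:

<flow name="chat-with-memory-flow">
  <http:listener doc:name="Listener" config-ref="HTTP_Listener_config" path="/chat-memory"/>
  <!-- memoryName and dbFilePath are derived from the caller's userId query parameter -->
  <ms-aichain:chat-answer-prompt-with-memory
    doc:name="Chat answer prompt with memory"
    config-ref="MuleSoft_AI_Chain_Config"
    memoryName="#[attributes.queryParams.userId]"
    dbFilePath='#["/Users/john.wick/Desktop/mac-demo/db/" ++ attributes.queryParams.userId]'
    maxMessages="10">
    <ms-aichain:data><![CDATA[#[payload.prompt]]]></ms-aichain:data>
  </ms-aichain:chat-answer-prompt-with-memory>
</flow>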

Output Configuration

Response Payload

This operation responds with a JSON payload containing the main LLM response, with additional metadata stored in attributes.

Example Response Payload

{
    "response": "I'm sorry, I do not have access to personal information such as your name."
}

Attributes

Along with the JSON payload, the operation also returns attributes, including information about token usage:

{
  "tokenUsage": {
    "outputCount": 13,
    "totalCount": 44,
    "inputCount": 31
  },
  "additionalAttributes": {
    "memoryName": "memory",
    "maxMessages": "2",
    "dbFilePath": "/.../memory.db"
  }
}
  • tokenUsage
    • outputCount: The number of tokens used to generate the output.
    • totalCount: The total number of tokens used for both input and output.
    • inputCount: The number of tokens used to process the input.
  • additionalAttributes
    • memoryName: The name of the memory used in the conversation.
    • maxMessages: The maximum number of messages considered in the conversation memory.
    • dbFilePath: The file path where the conversation memory is stored.
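
If the calling client needs both the answer and this metadata in a single JSON response, a Transform Message step after the operation can merge them. The mapping below is a minimal sketch; the output field names are illustrative and assume the attributes are readable in DataWeave as shown:

<ee:transform doc:name="Merge response and metadata">
  <ee:message>
    <ee:set-payload><![CDATA[%dw 2.0
output application/json
---
{
  answer: payload.response,
  memoryName: attributes.additionalAttributes.memoryName,
  totalTokens: attributes.tokenUsage.totalCount
}]]></ee:set-payload>
  </ee:message>
</ee:transform>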

Example Use Cases

This operation is particularly useful in scenarios where you want to retain a conversation history and provide context to the LLM:

  • Customer Support Chats: Retaining the context of ongoing support conversations.
  • Multi-user Chat Applications: Maintaining conversation history for different users.
  • Personal Assistants: Keeping track of user interactions to provide more relevant responses.