Documents Chat Engine¶
Documentation¶
- Class name:
LLMChatEngine
- Category:
SALT/Language Toolkit/Querying
- Output node:
False
The LLMChatEngine node facilitates interactive chat sessions using a language model, enabling users to input queries and receive text responses. It lazily instantiates a chat engine from the supplied index on first use and can reset it on demand, so follow-up queries retain conversational context unless a fresh session is requested.
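The lazy-instantiation and reset behavior can be sketched stand-alone. In this minimal sketch, `ChatSession` and `ChatEngine` are illustrative stubs (not part of the SALT or llama_index APIs); the real node builds its engine from an LLM index instead:

```python
class ChatEngine:
    """Stub engine that tracks conversation history."""
    def __init__(self):
        self.history = []  # accumulated conversation context

    def chat(self, query):
        self.history.append(query)
        return f"reply #{len(self.history)} to: {query}"

class ChatSession:
    """Mirrors the node's caching logic: create on first use, rebuild on reset."""
    def __init__(self):
        self.engine = None  # created lazily

    def chat(self, query, reset_engine=False):
        # Re-instantiate when no engine exists or a fresh context is requested
        if self.engine is None or reset_engine:
            self.engine = ChatEngine()
        return self.engine.chat(query)

session = ChatSession()
first = session.chat("hello")                   # engine created here
second = session.chat("again")                  # same engine, context kept
fresh = session.chat("hi", reset_engine=True)   # context discarded
```

With the reset flag, the third call behaves like the start of a new conversation, which is exactly the effect of the node's optional reset_engine input.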
Input types¶
Required¶
llm_index
- The document index from which the chat engine is built; it is used to initialize (or, with reset_engine, reinitialize) the engine, and it determines the context available when generating responses.
- Comfy dtype:
LLM_INDEX
- Python dtype:
VectorStoreIndex (any llama_index index exposing as_chat_engine())
query
- The user's input query as a string, which is processed by the chat engine to generate a relevant text response. This input is essential for driving the conversation forward.
- Comfy dtype:
STRING
- Python dtype:
str
Optional¶
reset_engine
- A boolean flag indicating whether to reset the chat engine before processing the current query, allowing for fresh interactions without prior context.
- Comfy dtype:
BOOLEAN
- Python dtype:
bool
Output types¶
string
- The text response generated by the chat engine for the user's query.
- Comfy dtype:
STRING
- Python dtype:
str
Usage tips¶
- Infra type:
CPU
- Common nodes: unknown
Source code¶
from pprint import pprint

class LLMChatEngine:
    def __init__(self):
        self.chat_engine = None  # created lazily on the first chat() call

    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "llm_index": ("LLM_INDEX",),
                "query": ("STRING", {"multiline": True, "dynamicPrompts": False, "placeholder": "Ask a question"}),
            },
            "optional": {
                "reset_engine": ("BOOLEAN", {"default": False})
            }
        }

    RETURN_TYPES = ("STRING",)
    FUNCTION = "chat"
    CATEGORY = f"{MENU_NAME}/{SUB_MENU_NAME}/Querying"

    def chat(self, llm_index, query: str, reset_engine: bool = False) -> tuple:
        # Build a fresh chat engine when none exists yet or a reset was requested
        if not self.chat_engine or reset_engine:
            self.chat_engine = llm_index.as_chat_engine()
        response = self.chat_engine.chat(query)
        pprint(response, indent=4)
        return (response.response,)
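Outside of a ComfyUI graph, the chat method can be exercised by substituting a stub for the index. The sketch below assumes only that the index exposes `as_chat_engine()` and that the engine's reply carries its text on `.response`; `StubIndex`, `StubChatEngine`, and `StubResponse` are hypothetical stand-ins, and the pprint logging and graph metadata are omitted for brevity:

```python
class StubResponse:
    """Mimics a chat response object: the text lives on .response."""
    def __init__(self, text):
        self.response = text

class StubChatEngine:
    def chat(self, query):
        return StubResponse(f"echo: {query}")

class StubIndex:
    """Stands in for an LLM_INDEX value; only as_chat_engine() is required."""
    def as_chat_engine(self):
        return StubChatEngine()

class LLMChatEngine:  # trimmed copy of the node's chat logic
    def __init__(self):
        self.chat_engine = None

    def chat(self, llm_index, query, reset_engine=False):
        if not self.chat_engine or reset_engine:
            self.chat_engine = llm_index.as_chat_engine()
        response = self.chat_engine.chat(query)
        return (response.response,)

node = LLMChatEngine()
out = node.chat(StubIndex(), "What is in the documents?")
print(out)  # -> ('echo: What is in the documents?',)
```

Note that the method returns a one-element tuple, matching `RETURN_TYPES = ("STRING",)`, rather than a bare string.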