compass.llm.calling.ChatLLMCaller
- class ChatLLMCaller(llm_service, system_message, usage_tracker=None, **kwargs)
Bases: BaseLLMCaller

Class to support chat-like LLM calling functionality.
See also

LLMCaller – Simple LLM caller, with no memory and no parsing utilities.
JSONFromTextLLMCaller – LLM calling functionality that extracts structured data (JSON) from the text-based response.
SchemaOutputLLMCaller – LLM calling functionality that allows you to specify the expected output schema as part of the API call.
- Parameters:
  llm_service (compass.services.base.Service) – LLM service used for queries.
  system_message (str) – System message to use for chat with the LLM.
  usage_tracker (UsageTracker, optional) – Optional tracker instance to monitor token usage during LLM calls. By default, None.
  **kwargs – Keyword arguments to be passed to the underlying service processing function (i.e. llm_service.call(**kwargs)). Should not contain the keys usage_sub_label or messages; these arguments are provided by this caller object.
Methods

call(content[, usage_sub_label]) – Chat with the LLM.
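For illustration, the chat-caller pattern this class implements can be sketched as follows: keep a running message history seeded with the system message, append each user turn, and forward any extra keyword arguments to the underlying service's call. This is a minimal, self-contained sketch, not the compass implementation; `EchoService` and `MiniChatCaller` are hypothetical stand-ins.

```python
# Hypothetical sketch of the chat-caller pattern described above.
# EchoService and MiniChatCaller are illustrative stand-ins, NOT compass APIs.

class EchoService:
    """Stub LLM service: returns a canned reply and records call kwargs."""

    def call(self, messages, **kwargs):
        self.last_kwargs = kwargs  # extra kwargs forwarded by the caller
        return f"echo: {messages[-1]['content']}"

class MiniChatCaller:
    """Keeps chat memory plus a system message, analogous to ChatLLMCaller."""

    def __init__(self, llm_service, system_message, **kwargs):
        self.llm_service = llm_service
        self.kwargs = kwargs  # forwarded as llm_service.call(**kwargs)
        # Memory starts with the system message; later turns are appended.
        self.messages = [{"role": "system", "content": system_message}]

    def call(self, content):
        self.messages.append({"role": "user", "content": content})
        reply = self.llm_service.call(messages=self.messages, **self.kwargs)
        self.messages.append({"role": "assistant", "content": reply})
        return reply

caller = MiniChatCaller(EchoService(), "You are helpful.", temperature=0)
print(caller.call("Hello"))  # -> echo: Hello
print(len(caller.messages))  # -> 3 (system + user + assistant)
```

Note how `messages` is built by the caller object itself, which is why the documented `**kwargs` must not contain a `messages` key of their own.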