# Standalone Message Compressor

A compressor that uses an LLM to recontextualize the last message in the history, i.e. create a standalone version of the message that includes the necessary context.
## `ragbits.conversations.history.compressors.llm.StandaloneMessageCompressor`

    StandaloneMessageCompressor(llm: LLM, history_len: int = 5, prompt: type[Prompt[LastMessageAndHistory, str]] | None = None)

Bases: `ConversationHistoryCompressor`

A compressor that uses an LLM to recontextualize the last message in the history, i.e. create a standalone version of the message that includes the necessary context.

Initializes the StandaloneMessageCompressor with an LLM.

| PARAMETER | DESCRIPTION |
|---|---|
| `llm` | An LLM instance to handle recontextualizing the last message.<br>**TYPE:** `LLM` |
| `history_len` | The number of previous messages to include in the history.<br>**TYPE:** `int` **DEFAULT:** `5` |
| `prompt` | The prompt to use for recontextualizing the last message.<br>**TYPE:** `type[Prompt[LastMessageAndHistory, str]] \| None` **DEFAULT:** `None` |

Source code in `packages/ragbits-conversations/src/ragbits/conversations/history/compressors/llm.py`
### `configuration_key` *(class-attribute, instance-attribute)*
### `subclass_from_config` *(classmethod)*

Initializes the class with the provided configuration. May return a subclass of the class, if requested by the configuration.

| PARAMETER | DESCRIPTION |
|---|---|
| `config` | A model containing configuration details for the class. |

| RETURNS | DESCRIPTION |
|---|---|
| `Self` | An instance of the class initialized with the provided configuration. |

| RAISES | DESCRIPTION |
|---|---|
| `InvalidConfigError` | The class can't be found or is not a subclass of the current class. |

Source code in `packages/ragbits-core/src/ragbits/core/utils/config_handling.py`
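The mechanism can be illustrated with a minimal, stdlib-only sketch. The `"type"`/`"config"` keys and the error messages below are assumptions for illustration, not the actual ragbits config model; the example uses `collections.Counter` (a `dict` subclass) to show the subclass check passing:

```python
import importlib


class InvalidConfigError(Exception):
    """Raised when the class can't be found or isn't a subclass of the base."""


def subclass_from_config(base: type, config: dict):
    # Resolve "module.submodule:ClassName" into a class object.
    module_path, _, class_name = config["type"].partition(":")
    try:
        cls = getattr(importlib.import_module(module_path), class_name)
    except (ImportError, AttributeError) as exc:
        raise InvalidConfigError(str(exc)) from exc
    # Only subclasses of the requested base are accepted.
    if not (isinstance(cls, type) and issubclass(cls, base)):
        raise InvalidConfigError(f"{cls!r} is not a subclass of {base.__name__}")
    return cls(**config.get("config", {}))


# Counter subclasses dict, so this resolves and instantiates successfully:
counter = subclass_from_config(dict, {"type": "collections:Counter", "config": {}})
```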
### `subclass_from_factory` *(classmethod)*

Creates the class using the provided factory function. May return a subclass of the class, if requested by the factory.

| PARAMETER | DESCRIPTION |
|---|---|
| `factory_path` | A string representing the path to the factory function, in the format `"module.submodule:factory_name"`. |

| RETURNS | DESCRIPTION |
|---|---|
| `Self` | An instance of the class initialized with the provided factory function. |

| RAISES | DESCRIPTION |
|---|---|
| `InvalidConfigError` | The factory can't be found, or the object it returns is not a subclass of the current class. |

Source code in `packages/ragbits-core/src/ragbits/core/utils/config_handling.py`
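Resolving the `"module.submodule:factory_name"` path format can be sketched with `importlib` (error handling and the subclass check from the real method are omitted here for brevity):

```python
import importlib


def resolve_factory(factory_path: str):
    # Split "module.submodule:factory_name" on the colon, import the
    # module, then look up the factory callable by name.
    module_path, _, name = factory_path.partition(":")
    return getattr(importlib.import_module(module_path), name)


# Demonstrated with a stdlib callable standing in for a real factory:
sqrt = resolve_factory("math:sqrt")
```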
### `subclass_from_defaults` *(classmethod)*

    subclass_from_defaults(defaults: CoreConfig, factory_path_override: str | None = None, yaml_path_override: Path | None = None) -> Self

Tries to create an instance by looking at the default configuration file and the default factory function. Takes optional overrides for both, which take higher precedence.

| PARAMETER | DESCRIPTION |
|---|---|
| `defaults` | The CoreConfig instance containing default factory and configuration details.<br>**TYPE:** `CoreConfig` |
| `factory_path_override` | A string representing the path to the factory function, in the format `"module.submodule:factory_name"`.<br>**TYPE:** `str \| None` **DEFAULT:** `None` |
| `yaml_path_override` | The path to the YAML file containing the instance configuration.<br>**TYPE:** `Path \| None` **DEFAULT:** `None` |

| RAISES | DESCRIPTION |
|---|---|
| `InvalidConfigError` | If the default factory or configuration can't be found. |

Source code in `packages/ragbits-core/src/ragbits/core/utils/config_handling.py`
### `from_config` *(classmethod)*

Initializes the class with the provided configuration.

| PARAMETER | DESCRIPTION |
|---|---|
| `config` | A dictionary containing configuration details for the class. |

| RETURNS | DESCRIPTION |
|---|---|
| `Self` | An instance of the class initialized with the provided configuration. |

Source code in `packages/ragbits-core/src/ragbits/core/utils/config_handling.py`
### `compress` *(async)*

Contextualizes the last message in the conversation history.

| PARAMETER | DESCRIPTION |
|---|---|
| `conversation` | A list of dicts with `"role"` and `"content"` keys, representing the chat history so far. The most recent message should be from the user. |

Source code in `packages/ragbits-conversations/src/ragbits/conversations/history/compressors/llm.py`
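A self-contained sketch of the compression flow: the last user message and the `history_len` preceding messages are rendered into a prompt, and the LLM returns the standalone rewrite. `StubLLM`, `build_prompt`, and `compress` below are illustrative stand-ins, not the ragbits implementation:

```python
def build_prompt(last_message: str, history: list[str]) -> str:
    # Mirrors the shape of the prompt template: the message, then the
    # history as a bulleted list.
    lines = [f"Message:\n{last_message}", "", "History:"]
    lines += [f"* {m}" for m in history]
    return "\n".join(lines)


class StubLLM:
    """Stands in for a real model client; returns a canned rewrite."""

    def generate(self, prompt: str) -> str:
        return "What is the capital of France?"


def compress(conversation: list[dict], llm: StubLLM, history_len: int = 5) -> str:
    last = conversation[-1]["content"]  # most recent message, from the user
    # Keep only the last `history_len` previous messages as context.
    history = [m["content"] for m in conversation[:-1]][-history_len:]
    return llm.generate(build_prompt(last, history))


conversation = [
    {"role": "user", "content": "Tell me about France."},
    {"role": "assistant", "content": "France is a country in Europe."},
    {"role": "user", "content": "What is its capital?"},
]
standalone = compress(conversation, StubLLM())
```

The standalone question ("What is the capital of France?") no longer depends on the earlier turns, so it can be used directly for retrieval.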
## `ragbits.conversations.history.compressors.llm.LastMessageAndHistory`

## `ragbits.conversations.history.compressors.llm.StandaloneMessageCompressorPrompt`

Bases: `Prompt[LastMessageAndHistory, str]`

A prompt for recontextualizing the last message in the history.

Source code in `packages/ragbits-core/src/ragbits/core/prompt/prompt.py`
### `chat` *(property)*

Returns the conversation in the standard OpenAI chat format.

| RETURNS | DESCRIPTION |
|---|---|
| `ChatFormat` | A list of dictionaries, each containing the role and content of a message. |
### `json_mode` *(property)*

Returns whether the prompt should be sent in JSON mode.

| RETURNS | DESCRIPTION |
|---|---|
| `bool` | Whether the prompt should be sent in JSON mode. |
### `few_shots` *(class-attribute, instance-attribute)*

### `rendered_system_prompt` *(instance-attribute)*

    rendered_system_prompt = _render_template(system_prompt_template, input_data) if system_prompt_template else None

### `rendered_user_prompt` *(instance-attribute)*

### `system_prompt` *(class-attribute, instance-attribute)*

    system_prompt = '\n    Given a new message and a history of the conversation, create a standalone version of the message.\n    If the message references any context from history, it should be added to the message itself.\n    Return only the recontextualized message.\n    Do NOT return the history, do NOT answer the question, and do NOT add context irrelevant to the message.\n    '

### `user_prompt` *(class-attribute, instance-attribute)*

    user_prompt = '\n    Message:\n    {{ last_message }}\n\n    History:\n    {% for message in history %}\n    * {{ message }}\n    {% endfor %}\n    '
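The `user_prompt` above is a Jinja2 template (`{{ last_message }}` and a `{% for %}` loop over `history`). A plain-Python equivalent of what it renders, for illustration (exact whitespace differs from the raw template):

```python
def render_user_prompt(last_message: str, history: list[str]) -> str:
    # Equivalent of the Jinja2 template: the message, a blank line,
    # then each history entry as a "* " bullet.
    rendered = f"\nMessage:\n{last_message}\n\nHistory:\n"
    for message in history:
        rendered += f"* {message}\n"
    return rendered


text = render_user_prompt("What is its capital?", ["Tell me about France."])
```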
### `output_schema`

Returns the schema of the desired output. Can be used to request structured output from the LLM API or to validate the output. Returns either a Pydantic model or a JSON schema.

| RETURNS | DESCRIPTION |
|---|---|
| `dict \| type[BaseModel] \| None` | The schema of the desired output, or the model describing it. |

Source code in `packages/ragbits-core/src/ragbits/core/prompt/prompt.py`
### `list_images`

Returns the list of images in a format compatible with LLM APIs.

Returns: a list of dictionaries.
### `parse_response`

Parses the response from the LLM into the desired output type.

| PARAMETER | DESCRIPTION |
|---|---|
| `response` | The response from the LLM. |

| RETURNS | DESCRIPTION |
|---|---|
| `OutputT` | The parsed response. |

| RAISES | DESCRIPTION |
|---|---|
| `ResponseParsingError` | If the response cannot be parsed. |

Source code in `packages/ragbits-core/src/ragbits/core/prompt/prompt.py`
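The behaviour can be sketched as follows: a `str` output type passes the raw response through, while structured outputs are parsed (the real implementation validates Pydantic models; this stdlib-only sketch substitutes `json.loads`):

```python
import json


class ResponseParsingError(Exception):
    """Raised when the LLM response cannot be parsed into the output type."""


def parse_response(response: str, output_type: type):
    # str outputs pass through unchanged; anything else is assumed to be
    # JSON-encoded in this sketch.
    if output_type is str:
        return response
    try:
        return json.loads(response)
    except json.JSONDecodeError as exc:
        raise ResponseParsingError(str(exc)) from exc


assert parse_response("hello", str) == "hello"
assert parse_response('{"city": "Paris"}', dict) == {"city": "Paris"}
```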
### `add_few_shot`

    add_few_shot(user_message: str | InputT, assistant_message: str | OutputT) -> Prompt[InputT, OutputT]

Adds a few-shot example to the conversation.

| PARAMETER | DESCRIPTION |
|---|---|
| `user_message` | The raw user message, or input data that will be rendered using the user prompt template. |
| `assistant_message` | The raw assistant response, or output data that will be cast to a string (or, for a Pydantic model, to JSON). |

| RETURNS | DESCRIPTION |
|---|---|
| `Prompt[InputT, OutputT]` | The current prompt instance, to allow chaining. |

Source code in `packages/ragbits-core/src/ragbits/core/prompt/prompt.py`
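How chaining and the resulting chat layout work can be shown with a minimal stand-in class (`MiniPrompt` is illustrative, not the ragbits `Prompt`): each few-shot example becomes a user/assistant pair inserted between the system message and the final user message.

```python
class MiniPrompt:
    """Minimal stand-in showing few-shot chaining and the chat layout."""

    def __init__(self, system_prompt: str, user_prompt: str):
        self.system_prompt = system_prompt
        self.user_prompt = user_prompt
        self._few_shots: list[tuple[str, str]] = []

    def add_few_shot(self, user_message: str, assistant_message: str) -> "MiniPrompt":
        self._few_shots.append((user_message, assistant_message))
        return self  # returning self allows chaining

    @property
    def chat(self) -> list[dict]:
        # system message, then few-shot pairs, then the actual user message
        messages = [{"role": "system", "content": self.system_prompt}]
        for user, assistant in self._few_shots:
            messages.append({"role": "user", "content": user})
            messages.append({"role": "assistant", "content": assistant})
        messages.append({"role": "user", "content": self.user_prompt})
        return messages


prompt = (
    MiniPrompt("Recontextualize the message.", "Message: What about its capital?")
    .add_few_shot("Message: And the population?", "What is the population of France?")
)
```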
### `list_few_shots`

Returns the few-shot examples in the standard OpenAI chat format.

| RETURNS | DESCRIPTION |
|---|---|
| `ChatFormat` | A list of dictionaries, each containing the role and content of a message. |

Source code in `packages/ragbits-core/src/ragbits/core/prompt/prompt.py`
### `to_promptfoo` *(classmethod)*

Generates a prompt in the promptfoo format from a promptfoo test configuration.

| PARAMETER | DESCRIPTION |
|---|---|
| `config` | The promptfoo test configuration. |

| RETURNS | DESCRIPTION |
|---|---|
| `ChatFormat` | The prompt in the format used by promptfoo. |
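A heavily hedged sketch of the idea: promptfoo supplies test variables under a `vars` key, which are treated as the prompt's input and rendered into a chat. The function body and the `render` callback are assumptions for illustration, not the ragbits implementation:

```python
def to_promptfoo(config: dict, render) -> list[dict]:
    # Sketch: take the promptfoo test variables, render them with the
    # prompt's template, and return the chat for evaluation.
    variables = config.get("vars", {})
    return [{"role": "user", "content": render(variables)}]


chat = to_promptfoo(
    {"vars": {"last_message": "What is its capital?"}},
    lambda v: f"Message: {v['last_message']}",  # stand-in for template rendering
)
```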