# SimpleCognitiveMemory Documentation
`SimpleCognitiveMemory` is a lightweight class that simplifies memory management for AI systems, particularly when interacting with LLMs. It handles message storage, automatic pruning of old messages, preservation of important prompts, and optional removal of images to optimize token usage.
## Key Features
- Capacity Management: Automatically discards older messages when capacity is exceeded.
- Preservation of Crucial Messages: Allows preservation of initial messages, such as LLM instructions or initial prompts.
- Image Pruning: Optionally removes images from messages to reduce the number of tokens and lower memory usage.
## Initializing
You can create an instance of `SimpleCognitiveMemory` by specifying:
- `capacity`: The maximum number of messages to store (default is 5).
- `keep_images`: Whether to keep image URLs in messages (default is `True`).
- `preserve`: A list of messages to preserve and never prune (e.g., initial instructions).
```python
from cognition_layer.memory.simple import SimpleCognitiveMemory

memory = SimpleCognitiveMemory(capacity=10, keep_images=False)
```
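All three parameters can be combined in a single constructor call. The sketch below uses an illustrative system prompt; the `preserve` argument is covered in more detail in the next section:

```python
from cognition_layer.memory.simple import SimpleCognitiveMemory
from langchain_core.messages import SystemMessage

# Keep the system prompt permanently, cap the buffer at 10 messages,
# and strip image URLs to save tokens.
system_prompt = [SystemMessage(content="You are a helpful assistant.")]
memory = SimpleCognitiveMemory(
    capacity=10,
    keep_images=False,
    preserve=system_prompt,
)
```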
## Preserving Important Messages
To ensure that critical messages, such as initial prompts or instructions, are never discarded, use the `preserve` parameter:
```python
from cognition_layer.memory.simple import SimpleCognitiveMemory
from langchain_core.messages import SystemMessage

initial_instructions = [SystemMessage(content="Follow these rules strictly. [...]")]
memory = SimpleCognitiveMemory(capacity=10, preserve=initial_instructions)
```
Note: The total number of preserved messages cannot exceed the specified capacity.
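If this constraint is violated, the constructor is expected to reject the configuration. The sketch below assumes an exception is raised; the exact exception type is not specified here and may differ in the implementation:

```python
from cognition_layer.memory.simple import SimpleCognitiveMemory
from langchain_core.messages import SystemMessage

# Invalid configuration: 3 preserved messages but a capacity of only 2.
too_many = [SystemMessage(content=f"Rule {i}") for i in range(3)]
try:
    SimpleCognitiveMemory(capacity=2, preserve=too_many)
except Exception as exc:  # exact exception type depends on the implementation
    print(f"Rejected: {exc}")
```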
## Updating Memory with New Messages
To add new messages to the memory:
```python
from langchain_core.messages import HumanMessage

new_messages = [HumanMessage(content="What is the weather today?")]
memory.update(new_messages)
```
Older messages will be automatically removed when the capacity limit is reached.
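For example, the following sketch (assuming the oldest-first pruning described above) shows that only the most recent messages survive once the capacity is exceeded:

```python
from cognition_layer.memory.simple import SimpleCognitiveMemory
from langchain_core.messages import HumanMessage

small_memory = SimpleCognitiveMemory(capacity=3)
small_memory.update([HumanMessage(content=f"Message {i}") for i in range(5)])

# The two oldest messages ("Message 0" and "Message 1") should have been pruned.
print(len(small_memory.messages))  # expected: 3
```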
## Removing Images Automatically
To minimize token cost by removing images from the memory buffer, set `keep_images` to `False` during initialization:

```python
memory = SimpleCognitiveMemory(capacity=10, keep_images=False)
```
Alternatively, you can prune images from the messages manually using the utility function:
```python
from cognition_layer.memory.simple import prune_images_from_messages

prune_images_from_messages(memory.messages)
```
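As a rough sketch, assuming messages carry images in LangChain's standard content-block format (a content list with `image_url` entries), manual pruning could look like this:

```python
from cognition_layer.memory.simple import SimpleCognitiveMemory, prune_images_from_messages
from langchain_core.messages import HumanMessage

# A multimodal message using LangChain's content-block format
message_with_image = HumanMessage(
    content=[
        {"type": "text", "text": "Describe this screenshot."},
        {"type": "image_url", "image_url": {"url": "data:image/png;base64,..."}},
    ]
)

memory = SimpleCognitiveMemory(capacity=10, keep_images=True)
memory.update([message_with_image])

# Strip the image blocks from the buffered messages to save tokens
prune_images_from_messages(memory.messages)
```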
## Retrieving Messages
Access the current state of memory through the `messages` property, which includes both preserved and regular messages:

```python
all_messages = memory.messages
```
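For instance, you can iterate over the buffer to inspect what is currently stored (a minimal sketch reusing the `memory` instance from the previous sections):

```python
print(f"{len(memory.messages)} messages currently in memory")
for msg in memory.messages:
    # Each entry is a LangChain message object (SystemMessage, HumanMessage, AIMessage, ...)
    print(f"[{type(msg).__name__}] {msg.content}")
```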
## Example Usage
```python
from cognition_layer.memory.simple import SimpleCognitiveMemory
from langchain_core.messages import SystemMessage, AIMessage, HumanMessage

# Initialize memory with capacity and preserved instructions
instructions = [SystemMessage(content="You are a helpful assistant.")]
memory = SimpleCognitiveMemory(capacity=5, preserve=instructions, keep_images=False)

# Update memory with new messages
memory.update([AIMessage(content="Hello, how can I assist you today?")])
memory.update([HumanMessage(content="What's the weather today?")])

# Retrieve all stored messages
for msg in memory.messages:
    print(msg.content)
```
## Conclusion
The `SimpleCognitiveMemory` class simplifies memory handling for LLMs by:
- Automatically managing capacity and removing old messages.
- Preserving crucial prompts or instructions.
- Optionally reducing memory cost by removing images.
This makes it ideal for efficient and scalable LLM-based applications where memory management and token optimization are critical.