Package dev.langchain4j.memory.chat
Class TokenWindowChatMemory
java.lang.Object
dev.langchain4j.memory.chat.TokenWindowChatMemory
- All Implemented Interfaces:
dev.langchain4j.memory.ChatMemory
This chat memory operates as a sliding window of
maxTokens tokens.
It retains as many of the most recent messages as can fit into the window.
If there is not enough space for a new message, the oldest one (or several of the oldest) is evicted.
Messages are indivisible. If a message doesn't fit, it is evicted completely.
Once added, a SystemMessage is always retained.
Only one SystemMessage can be held at a time.
If a new SystemMessage with the same content is added, it is ignored.
If a new SystemMessage with different content is added, the previous SystemMessage is removed.
If an AiMessage containing ToolExecutionRequest(s) is evicted,
the following orphan ToolExecutionResultMessage(s) are also automatically evicted
to avoid problems with some LLM providers (such as OpenAI)
that prohibit sending orphan ToolExecutionResultMessage(s) in the request.
The state of chat memory is stored in ChatMemoryStore (SingleSlotChatMemoryStore is used by default).
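The eviction policy described above can be sketched in plain Java. This is a hypothetical simplification, not the actual langchain4j implementation: the `Message` record and its fixed token counts are invented stand-ins for `ChatMessage` and `Tokenizer`, used only to illustrate whole-message eviction with a pinned system message.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Hypothetical sketch of the sliding-window policy: evict oldest messages
// first, evict whole messages only, and never evict the system message.
public class TokenWindowSketch {

    // Invented stand-in for ChatMessage; token counts are precomputed here,
    // whereas the real library delegates counting to a Tokenizer.
    public record Message(String role, String text, int tokens) {}

    public static List<Message> fitIntoWindow(List<Message> history, int maxTokens) {
        Message system = null;
        Deque<Message> rest = new ArrayDeque<>();
        for (Message m : history) {
            if (m.role().equals("system")) {
                system = m; // only the latest system message is retained
            } else {
                rest.addLast(m);
            }
        }
        // The pinned system message consumes part of the budget up front.
        int budget = maxTokens - (system == null ? 0 : system.tokens());
        int used = rest.stream().mapToInt(Message::tokens).sum();
        while (used > budget && !rest.isEmpty()) {
            used -= rest.removeFirst().tokens(); // evict oldest whole message
        }
        List<Message> window = new ArrayList<>();
        if (system != null) window.add(system);
        window.addAll(rest);
        return window;
    }

    public static void main(String[] args) {
        List<Message> history = List.of(
                new Message("system", "You are helpful.", 10),
                new Message("user", "first question", 40),
                new Message("ai", "first answer", 40),
                new Message("user", "second question", 40));
        // Budget for non-system messages: 100 - 10 = 90 tokens, so the
        // oldest user message (40 tokens) is evicted in its entirety.
        for (Message m : fitIntoWindow(history, 100)) {
            System.out.println(m.role() + ": " + m.text());
        }
        // prints:
        // system: You are helpful.
        // ai: first answer
        // user: second question
    }
}
```

Note that a message is never truncated to fit: when the budget is exceeded, the whole oldest message goes, which matches the "messages are indivisible" rule above.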
Nested Class Summary
static class TokenWindowChatMemory.Builder
Method Summary
Modifier and Type                               Method
void                                            add(dev.langchain4j.data.message.ChatMessage message)
static TokenWindowChatMemory.Builder            builder()
void                                            clear()
Object                                          id()
List<dev.langchain4j.data.message.ChatMessage>  messages()
static TokenWindowChatMemory                    withMaxTokens(int maxTokens, dev.langchain4j.model.Tokenizer tokenizer)
Method Details
id
public Object id()
- Specified by:
id in interface dev.langchain4j.memory.ChatMemory
add
public void add(dev.langchain4j.data.message.ChatMessage message)
- Specified by:
add in interface dev.langchain4j.memory.ChatMemory
messages
public List<dev.langchain4j.data.message.ChatMessage> messages()
- Specified by:
messages in interface dev.langchain4j.memory.ChatMemory
clear
public void clear()
- Specified by:
clear in interface dev.langchain4j.memory.ChatMemory
builder
public static TokenWindowChatMemory.Builder builder()
withMaxTokens
public static TokenWindowChatMemory withMaxTokens(int maxTokens, dev.langchain4j.model.Tokenizer tokenizer)