Class TokenWindowChatMemory

java.lang.Object
dev.langchain4j.memory.chat.TokenWindowChatMemory
All Implemented Interfaces:
dev.langchain4j.memory.ChatMemory

public class TokenWindowChatMemory extends Object implements dev.langchain4j.memory.ChatMemory
This chat memory operates as a sliding window of maxTokens tokens. It retains as many of the most recent messages as fit into the window. If there is not enough space for a new message, the oldest one (or several) is evicted. Messages are indivisible: if a message does not fit, it is evicted whole rather than truncated.
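The sliding-window rule above can be sketched as follows. This is a minimal simulation with hypothetical stand-in types, not the class's actual implementation: tokens are approximated by word count instead of a real Tokenizer, and Msg stands in for ChatMessage.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the sliding-window eviction rule.
// Msg and the word-count "tokenizer" are stand-ins for illustration only.
class SlidingWindowSketch {

    record Msg(String text) {
        int tokens() { return text.split("\\s+").length; } // crude token estimate
    }

    // Evict whole messages from the front until the window fits maxTokens.
    static void fit(List<Msg> window, int maxTokens) {
        int total = window.stream().mapToInt(Msg::tokens).sum();
        while (total > maxTokens && !window.isEmpty()) {
            // Oldest message is evicted completely, never truncated.
            total -= window.remove(0).tokens();
        }
    }
}
```

With a 6-token window and messages of 3, 2, and 4 words, the oldest 3-word message is evicted whole, leaving the two most recent messages (6 tokens total).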

Once added, a SystemMessage is always retained. Only one SystemMessage can be held at a time. If a new SystemMessage with the same content is added, it is ignored. If a new SystemMessage with different content is added, the previous SystemMessage is removed.

If an AiMessage containing ToolExecutionRequest(s) is evicted, the orphaned ToolExecutionResultMessage(s) that follow it are also evicted automatically, to avoid problems with some LLM providers (such as OpenAI) that reject requests containing orphan ToolExecutionResultMessage(s).
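The orphan-cleanup rule can be sketched like this. Again a simulation with hypothetical stand-in types: tool results reference their originating request by an id, and evicting the request drops its results too.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of orphan cleanup after eviction. Msg, its type strings, and
// toolCallId are illustrative stand-ins, not the library's real types.
class OrphanEvictionSketch {

    record Msg(String type, String toolCallId) {}

    // Evict the oldest message; if it was an AiMessage carrying a tool
    // request, also drop the now-orphaned tool results that reference it.
    static void evictOldest(List<Msg> window) {
        Msg evicted = window.remove(0);
        if (evicted.type().equals("ai-with-tool-request")) {
            window.removeIf(m -> m.type().equals("tool-result")
                    && m.toolCallId().equals(evicted.toolCallId()));
        }
    }
}
```

This mirrors the documented behavior: a tool result is never left in the window without the AiMessage that requested it.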

The state of the chat memory is stored in a ChatMemoryStore (InMemoryChatMemoryStore is used by default).

  • Method Details

    • id

      public Object id()
      Specified by:
      id in interface dev.langchain4j.memory.ChatMemory
    • add

      public void add(dev.langchain4j.data.message.ChatMessage message)
      Specified by:
      add in interface dev.langchain4j.memory.ChatMemory
    • messages

      public List<dev.langchain4j.data.message.ChatMessage> messages()
      Specified by:
      messages in interface dev.langchain4j.memory.ChatMemory
    • clear

      public void clear()
      Specified by:
      clear in interface dev.langchain4j.memory.ChatMemory
    • builder

      public static TokenWindowChatMemory.Builder builder()
    • withMaxTokens

      public static TokenWindowChatMemory withMaxTokens(int maxTokens, dev.langchain4j.model.Tokenizer tokenizer)
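Both construction paths can be sketched roughly as below. This is a hedged usage sketch, not verified against a specific release: the memory id and model name are made up for the example, and OpenAiTokenizer is assumed to come from the langchain4j-open-ai module.

```java
import dev.langchain4j.memory.chat.TokenWindowChatMemory;
import dev.langchain4j.model.openai.OpenAiTokenizer;

class MemorySetupSketch {
    void example() {
        // Static factory: window capped at 300 tokens, as counted by the tokenizer.
        TokenWindowChatMemory simple =
                TokenWindowChatMemory.withMaxTokens(300, new OpenAiTokenizer("gpt-3.5-turbo"));

        // Builder: same cap plus an explicit memory id ("user-123" is hypothetical).
        TokenWindowChatMemory perUser = TokenWindowChatMemory.builder()
                .id("user-123")
                .maxTokens(300, new OpenAiTokenizer("gpt-3.5-turbo"))
                .build();
    }
}
```

The builder form is useful when each conversation needs its own id or a custom ChatMemoryStore; the static factory is a shorthand for the common single-window case.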