Package dev.langchain4j.model.chat
Interface StreamingChatLanguageModel
public interface StreamingChatLanguageModel
Represents a language model that has a chat API and can stream a response one token at a time.
Method Summary
Modifier and Type          Method                                                                  Description
void                       chat(ChatRequest chatRequest, StreamingChatResponseHandler handler)    This is the main API to interact with the chat model.
ChatRequestParameters      defaultRequestParameters()
List<ChatModelListener>    listeners()
ModelProvider              provider()
void                       doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
void                       chat(String userMessage, StreamingChatResponseHandler handler)
void                       chat(List<ChatMessage> messages, StreamingChatResponseHandler handler)
Set<Capability>            supportedCapabilities()
Method Detail
-
chat
void chat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
This is the main API to interact with the chat model.
A temporary default implementation of this method is provided until all StreamingChatLanguageModel implementations adopt it; the default will be removed once they do.
Parameters:
chatRequest - a ChatRequest containing all the inputs to the LLM
handler - a StreamingChatResponseHandler that will handle the streaming response from the LLM
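For illustration, a minimal usage sketch. It assumes the langchain4j-open-ai module is on the classpath to provide a concrete implementation; the model name and the OPENAI_API_KEY environment variable are placeholders, and any other StreamingChatLanguageModel implementation can be substituted.

import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

public class StreamingChatExample {

    public static void main(String[] args) {

        // Any StreamingChatLanguageModel implementation works here;
        // OpenAiStreamingChatModel is used only as an example.
        StreamingChatLanguageModel model = OpenAiStreamingChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY")) // placeholder
                .modelName("gpt-4o-mini")                // placeholder
                .build();

        ChatRequest chatRequest = ChatRequest.builder()
                .messages(UserMessage.from("Tell me a joke about Java"))
                .build();

        model.chat(chatRequest, new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                // Invoked for each partial token as it is streamed
                System.out.print(partialResponse);
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                // Invoked once the full response has been assembled
                System.out.println("\nComplete: " + completeResponse.aiMessage().text());
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}

Because streaming happens asynchronously, a real application would typically block until onCompleteResponse or onError fires, for example with a CountDownLatch.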
-
defaultRequestParameters
ChatRequestParameters defaultRequestParameters()
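For illustration, a sketch of reading the defaults configured on a model and overriding them for a single request. It assumes ChatRequestParameters.builder() as the way to construct per-request overrides; the prompt, class name, and temperature value are placeholders.

import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.request.ChatRequestParameters;

public class DefaultParametersExample {

    static ChatRequest buildRequest(StreamingChatLanguageModel model) {

        // Defaults the implementation was configured with (model name, temperature, ...)
        ChatRequestParameters defaults = model.defaultRequestParameters();
        System.out.println("default model name: " + defaults.modelName());

        // Per-request parameters can override those defaults for a single call
        return ChatRequest.builder()
                .messages(UserMessage.from("Summarize this in one sentence: ..."))
                .parameters(ChatRequestParameters.builder()
                        .temperature(0.2)
                        .build())
                .build();
    }
}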
-
listeners
List<ChatModelListener> listeners()
-
provider
ModelProvider provider()
-
doChat
void doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
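As a rough sketch of the split between chat(...) and doChat(...): callers invoke chat(...), while an implementation typically supplies the actual model call by overriding doChat(...). The toy class below is invented for illustration only; it "streams" a fixed reply instead of calling a real model.

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

public class EchoStreamingModel implements StreamingChatLanguageModel {

    @Override
    public void doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler) {
        String reply = "echo of " + chatRequest.messages().size() + " message(s)";

        // Emit the reply word by word, the way a real model would emit tokens
        for (String token : reply.split(" ")) {
            handler.onPartialResponse(token + " ");
        }

        // Signal completion with the assembled response
        handler.onCompleteResponse(ChatResponse.builder()
                .aiMessage(AiMessage.from(reply))
                .build());
    }
}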
-
chat
void chat(String userMessage, StreamingChatResponseHandler handler)
-
chat
void chat(List<ChatMessage> messages, StreamingChatResponseHandler handler)
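For illustration, a sketch covering both convenience overloads above; the prompts and the class name are placeholders, and the handler can be any StreamingChatResponseHandler (such as the one shown in the chat(ChatRequest, ...) example).

import java.util.List;

import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.SystemMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

public class StreamingOverloadsExample {

    static void streamBoth(StreamingChatLanguageModel model, StreamingChatResponseHandler handler) {

        // Single user message passed as a plain String
        model.chat("Explain streaming in one sentence", handler);

        // Full conversation history passed as a List of ChatMessages
        List<ChatMessage> messages = List.of(
                SystemMessage.from("You are a concise assistant"),
                UserMessage.from("What is LangChain4j?"));
        model.chat(messages, handler);
    }
}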
-
supportedCapabilities
Set<Capability> supportedCapabilities()
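A short sketch of probing capabilities before relying on a feature; Capability.RESPONSE_FORMAT_JSON_SCHEMA is used as the example constant, and the class and method names are invented for illustration.

import java.util.Set;

import dev.langchain4j.model.chat.Capability;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;

public class CapabilityCheckExample {

    static boolean supportsJsonSchema(StreamingChatLanguageModel model) {
        Set<Capability> capabilities = model.supportedCapabilities();
        // Decide at runtime whether structured (JSON-schema) output can be requested
        return capabilities.contains(Capability.RESPONSE_FORMAT_JSON_SCHEMA);
    }
}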