Interface StreamingChatLanguageModel
public interface StreamingChatLanguageModel
Represents a language model that has a chat API and can stream a response one token at a time.
Method Summary

void chat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
    This is the main API to interact with the chat model.
ChatRequestParameters defaultRequestParameters()
Set<Capability> supportedCapabilities()
void generate(String userMessage, StreamingResponseHandler<AiMessage> handler)
    Generates a response from the model based on a message from a user.
void generate(UserMessage userMessage, StreamingResponseHandler<AiMessage> handler)
    Generates a response from the model based on a message from a user.
abstract void generate(List<ChatMessage> messages, StreamingResponseHandler<AiMessage> handler)
    Generates a response from the model based on a sequence of messages.
void generate(List<ChatMessage> messages, List<ToolSpecification> toolSpecifications, StreamingResponseHandler<AiMessage> handler)
    Generates a response from the model based on a list of messages and a list of tool specifications.
void generate(List<ChatMessage> messages, ToolSpecification toolSpecification, StreamingResponseHandler<AiMessage> handler)
    Generates a response from the model based on a list of messages and a single tool specification.
Method Detail
chat
void chat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
This is the main API to interact with the chat model. All the existing generate(...) methods (see below) will be deprecated and removed before the 1.0.0 release.
A temporary default implementation of this method is necessary until all StreamingChatLanguageModel implementations adopt it; it should be removed once they do.
- Parameters:
chatRequest - a ChatRequest containing all the inputs to the LLM
handler - a StreamingChatResponseHandler that will handle the streaming response from the LLM
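The streaming flow behind chat(...) can be sketched with minimal stand-ins for the documented types. This is a hypothetical sketch, not the langchain4j implementation: the Handler interface below only mirrors the callback names assumed from StreamingChatResponseHandler (onPartialResponse, onCompleteResponse, onError; verify against your version), and the stub "model" exists only to show the order in which a streaming model invokes the handler.

```java
import java.util.List;

public class StreamingSketch {
    // Minimal stand-in for StreamingChatResponseHandler; callback names
    // mirror the langchain4j API (an assumption, verify against your version).
    interface Handler {
        void onPartialResponse(String partialResponse);   // one token/chunk at a time
        void onCompleteResponse(String completeResponse); // full aggregated response, called once
        void onError(Throwable error);                    // called instead, on failure
    }

    // Hypothetical stub "model" that streams a canned response token by token,
    // illustrating the callback order of a StreamingChatLanguageModel.
    static void chat(String userMessage, Handler handler) {
        List<String> tokens = List.of("Hello", ", ", "world", "!");
        StringBuilder aggregated = new StringBuilder();
        try {
            for (String token : tokens) {
                handler.onPartialResponse(token); // fired once per streamed token
                aggregated.append(token);
            }
            handler.onCompleteResponse(aggregated.toString()); // fired exactly once at the end
        } catch (RuntimeException e) {
            handler.onError(e);
        }
    }

    public static void main(String[] args) {
        StringBuilder seen = new StringBuilder();
        chat("Hi", new Handler() {
            @Override public void onPartialResponse(String t) { seen.append(t); }
            @Override public void onCompleteResponse(String full) {
                System.out.println("complete: " + full);
            }
            @Override public void onError(Throwable error) { error.printStackTrace(); }
        });
        System.out.println("streamed: " + seen);
    }
}
```

The key point the sketch shows: partial-response callbacks arrive incrementally and the complete-response callback carries the whole aggregated answer, so a UI can render tokens as they arrive and swap in the final response at the end.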
defaultRequestParameters
ChatRequestParameters defaultRequestParameters()
supportedCapabilities
Set<Capability> supportedCapabilities()
generate
void generate(String userMessage, StreamingResponseHandler<AiMessage> handler)
Generates a response from the model based on a message from a user.
- Parameters:
userMessage - The message from the user.
handler - The handler for streaming the response.
generate
void generate(UserMessage userMessage, StreamingResponseHandler<AiMessage> handler)
Generates a response from the model based on a message from a user.
- Parameters:
userMessage - The message from the user.
handler - The handler for streaming the response.
generate
abstract void generate(List<ChatMessage> messages, StreamingResponseHandler<AiMessage> handler)
Generates a response from the model based on a sequence of messages. Typically, the sequence contains messages in the following order: System (optional) - User - AI - User - AI - User ...
- Parameters:
messages - A list of messages.
handler - The handler for streaming the response.
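The message ordering described above (System optional, then alternating User and AI turns) can be sketched with a simple record standing in for ChatMessage. The Message record below is a hypothetical stand-in; the real langchain4j types are SystemMessage, UserMessage, and AiMessage.

```java
import java.util.List;

public class MessageOrderSketch {
    // Hypothetical stand-in for ChatMessage and its subtypes; the real
    // langchain4j types are SystemMessage, UserMessage, and AiMessage.
    record Message(String role, String text) { }

    // Builds a conversation history in the documented order:
    // System (optional) - User - AI - User ...
    static List<Message> history() {
        return List.of(
            new Message("system", "You are a concise assistant."),
            new Message("user", "What is streaming?"),
            new Message("ai", "Receiving a response one token at a time."),
            new Message("user", "Why use it?")
        );
    }

    public static void main(String[] args) {
        // The last message is typically the new user turn the model responds to.
        history().forEach(m -> System.out.println(m.role() + ": " + m.text()));
    }
}
```

In this pattern the caller owns the history: each AI response is appended to the list before the next user turn, and the whole list is passed back to generate(...) on every call.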
generate
void generate(List<ChatMessage> messages, List<ToolSpecification> toolSpecifications, StreamingResponseHandler<AiMessage> handler)
Generates a response from the model based on a list of messages and a list of tool specifications. The response may either be a text message or a request to execute one of the specified tools. Typically, the list contains messages in the following order: System (optional) - User - AI - User - AI - User ...
- Parameters:
messages - A list of messages.
toolSpecifications - A list of tools that the model is allowed to execute.
handler - The handler for streaming the response.
generate
void generate(List<ChatMessage> messages, ToolSpecification toolSpecification, StreamingResponseHandler<AiMessage> handler)
Generates a response from the model based on a list of messages and a single tool specification. The model is forced to execute the specified tool. This is usually achieved by setting `tool_choice=ANY` in the LLM provider API.
- Parameters:
messages - A list of messages.
toolSpecification - The specification of a tool that must be executed.
handler - The handler for streaming the response.