Package dev.langchain4j.model.openai

Class OpenAiStreamingChatModel

java.lang.Object
    dev.langchain4j.model.openai.OpenAiStreamingChatModel

All Implemented Interfaces:
StreamingChatLanguageModel, TokenCountEstimator

public class OpenAiStreamingChatModel
extends Object
implements StreamingChatLanguageModel, TokenCountEstimator
Represents an OpenAI language model with a chat completion interface, such as gpt-3.5-turbo and gpt-4.
The model's response is streamed token by token and should be handled with StreamingResponseHandler.
You can find a description of the parameters here.
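For illustration, a minimal sketch of streaming a chat completion. It assumes the builder exposes fluent setters named after the constructor parameters (apiKey, modelName, temperature) and that the convenience generate(String, StreamingResponseHandler) default method from StreamingChatLanguageModel is available; treat the exact method names as version-dependent.

```java
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;
import dev.langchain4j.model.output.Response;

public class StreamingExample {

    public static void main(String[] args) {
        // Build the model; only apiKey is strictly required,
        // the other parameters fall back to library defaults.
        OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-3.5-turbo")
                .temperature(0.7)
                .build();

        model.generate("Tell me a joke", new StreamingResponseHandler<AiMessage>() {

            @Override
            public void onNext(String token) {
                // Called for each token as soon as it arrives from the API.
                System.out.print(token);
            }

            @Override
            public void onComplete(Response<AiMessage> response) {
                // Called once the full response has been streamed.
                System.out.println("\nFinish reason: " + response.finishReason());
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}
```

The handler-based design means generate returns immediately; all output arrives asynchronously through the callbacks.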
Nested Class Summary

Nested Classes
Modifier and Type: static class
Constructor Summary

Constructors
OpenAiStreamingChatModel(String baseUrl, String apiKey, String organizationId, String modelName, Double temperature, Double topP, List<String> stop, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Map<String, Integer> logitBias, String responseFormat, Integer seed, String user, Duration timeout, Proxy proxy, Boolean logRequests, Boolean logResponses, Tokenizer tokenizer)
Method Summary

builder()
int estimateTokenCount(List<ChatMessage> messages)
void generate(List<ChatMessage> messages, ToolSpecification toolSpecification, StreamingResponseHandler<AiMessage> handler)
void generate(List<ChatMessage> messages, StreamingResponseHandler<AiMessage> handler)
void generate(List<ChatMessage> messages, List<ToolSpecification> toolSpecifications, StreamingResponseHandler<AiMessage> handler)
static OpenAiStreamingChatModel withApiKey(String apiKey)

Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface dev.langchain4j.model.chat.StreamingChatLanguageModel:
generate

Methods inherited from interface dev.langchain4j.model.chat.TokenCountEstimator:
estimateTokenCount, estimateTokenCount, estimateTokenCount, estimateTokenCount
Constructor Details

OpenAiStreamingChatModel

public OpenAiStreamingChatModel(String baseUrl, String apiKey, String organizationId, String modelName, Double temperature, Double topP, List<String> stop, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Map<String, Integer> logitBias, String responseFormat, Integer seed, String user, Duration timeout, Proxy proxy, Boolean logRequests, Boolean logResponses, Tokenizer tokenizer)
Method Details

generate

public void generate(List<ChatMessage> messages, StreamingResponseHandler<AiMessage> handler)

Specified by:
generate in interface StreamingChatLanguageModel
generate

public void generate(List<ChatMessage> messages, List<ToolSpecification> toolSpecifications, StreamingResponseHandler<AiMessage> handler)

Specified by:
generate in interface StreamingChatLanguageModel
generate

public void generate(List<ChatMessage> messages, ToolSpecification toolSpecification, StreamingResponseHandler<AiMessage> handler)

Specified by:
generate in interface StreamingChatLanguageModel
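As a sketch of the tool-aware overload: passing a ToolSpecification lets the model request a tool call instead of (or in addition to) plain text. The ToolSpecification.builder() fluent API, the getWeather tool, and the AiMessage.toolExecutionRequests() accessor used here are assumptions about the surrounding library version, not guaranteed by this page.

```java
import dev.langchain4j.agent.tool.ToolSpecification;
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;
import dev.langchain4j.model.output.Response;

import java.util.List;

public class ToolStreamingExample {

    public static void main(String[] args) {
        OpenAiStreamingChatModel model =
                OpenAiStreamingChatModel.withApiKey(System.getenv("OPENAI_API_KEY"));

        // Hypothetical tool the model may choose to call.
        ToolSpecification weatherTool = ToolSpecification.builder()
                .name("getWeather")
                .description("Returns the current weather for a city")
                .build();

        List<ChatMessage> messages =
                List.of(UserMessage.from("What is the weather in Berlin?"));

        model.generate(messages, weatherTool, new StreamingResponseHandler<AiMessage>() {

            @Override
            public void onNext(String token) {
                System.out.print(token);
            }

            @Override
            public void onComplete(Response<AiMessage> response) {
                AiMessage aiMessage = response.content();
                // If the model decided to call the tool,
                // the request(s) are attached to the final message.
                if (aiMessage.toolExecutionRequests() != null) {
                    aiMessage.toolExecutionRequests().forEach(request ->
                            System.out.println("Tool requested: " + request.name()));
                }
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}
```

Note that executing the requested tool and feeding its result back to the model is the caller's responsibility; this class only surfaces the request.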
estimateTokenCount

public int estimateTokenCount(List<ChatMessage> messages)

Specified by:
estimateTokenCount in interface TokenCountEstimator
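A small sketch of counting prompt tokens before sending a request, useful for staying under the model's context window. It assumes a Tokenizer was supplied at construction time (the constructor's tokenizer parameter) and that model refers to an already-built OpenAiStreamingChatModel.

```java
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;

import java.util.List;

// model must have been built with a Tokenizer, e.g. via the
// tokenizer(...) constructor/builder parameter shown above.
List<ChatMessage> messages = List.of(UserMessage.from("Hello, how are you?"));
int tokens = model.estimateTokenCount(messages);
System.out.println("Estimated prompt tokens: " + tokens);
```

The estimate counts input tokens only; tokens generated in the streamed response are not included.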
withApiKey

public static OpenAiStreamingChatModel withApiKey(String apiKey)

builder
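For quick experiments, the withApiKey factory builds a model from an API key alone, leaving every other constructor parameter at the library's defaults; use builder() when you need to customize them. A one-line sketch:

```java
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

// Equivalent to builder().apiKey(...).build() with all defaults (an assumption
// based on the signatures above, not stated explicitly on this page).
OpenAiStreamingChatModel model =
        OpenAiStreamingChatModel.withApiKey(System.getenv("OPENAI_API_KEY"));
```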