Class AzureOpenAiStreamingChatModel
java.lang.Object
io.quarkiverse.langchain4j.azure.openai.AzureOpenAiStreamingChatModel
All Implemented Interfaces:
dev.langchain4j.model.chat.StreamingChatLanguageModel, dev.langchain4j.model.chat.TokenCountEstimator
public class AzureOpenAiStreamingChatModel
extends Object
implements dev.langchain4j.model.chat.StreamingChatLanguageModel, dev.langchain4j.model.chat.TokenCountEstimator
Represents an OpenAI language model, hosted on Azure, that has a chat completion interface, such as gpt-3.5-turbo. The model's response is streamed token by token and should be handled with StreamingResponseHandler.
Mandatory parameters for initialization are: apiVersion, apiKey, and either endpoint OR resourceName and deploymentName.
There are two primary authentication methods to access Azure OpenAI:
1. API Key Authentication: HTTP requests must include the API key in the "api-key" HTTP header.
2. Azure Active Directory Authentication: HTTP requests must include the authentication/access token in the "Authorization" HTTP header.
Please note that this class currently supports only API Key authentication; Azure Active Directory authentication will be supported later.
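Below is a minimal construction sketch (not taken from the project's documentation). It uses the public constructor shown under "Constructor Details", passes placeholder endpoint and API-version values, reads the API key from an environment variable, and assumes langchain4j's OpenAiTokenizer is available on the classpath for the tokenizer parameter; null is assumed to be accepted for the optional maxTokens and proxy parameters.

import dev.langchain4j.model.openai.OpenAiTokenizer;
import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiStreamingChatModel;

import java.time.Duration;

public class ModelFactory {

    // Builds the model via the public constructor documented under "Constructor Details".
    public static AzureOpenAiStreamingChatModel create() {
        return new AzureOpenAiStreamingChatModel(
                "https://<resource-name>.openai.azure.com/openai/deployments/<deployment-name>", // endpoint (placeholder)
                "2023-05-15",                          // apiVersion (placeholder)
                System.getenv("AZURE_OPENAI_API_KEY"), // apiKey
                new OpenAiTokenizer("gpt-3.5-turbo"),  // tokenizer (assumes langchain4j's OpenAI module is present)
                0.7,                                   // temperature
                1.0,                                   // topP
                null,                                  // maxTokens (assumed optional)
                0.0,                                   // presencePenalty
                0.0,                                   // frequencyPenalty
                Duration.ofSeconds(60),                // timeout
                null,                                  // proxy (assumed optional)
                false,                                 // logRequests
                false);                                // logResponses
    }
}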
Nested Class Summary
Nested Classes
Constructor Summary
ConstructorsConstructorDescriptionAzureOpenAiStreamingChatModel
(String endpoint, String apiVersion, String apiKey, dev.langchain4j.model.Tokenizer tokenizer, Double temperature, Double topP, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Duration timeout, Proxy proxy, Boolean logRequests, Boolean logResponses) -
Method Summary
builder()
int estimateTokenCount(List<dev.langchain4j.data.message.ChatMessage> messages)
void generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.agent.tool.ToolSpecification toolSpecification, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
void generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
void generate(List<dev.langchain4j.data.message.ChatMessage> messages, List<dev.langchain4j.agent.tool.ToolSpecification> toolSpecifications, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface dev.langchain4j.model.chat.StreamingChatLanguageModel
generate
Methods inherited from interface dev.langchain4j.model.chat.TokenCountEstimator
estimateTokenCount, estimateTokenCount, estimateTokenCount, estimateTokenCount
Constructor Details
AzureOpenAiStreamingChatModel
public AzureOpenAiStreamingChatModel(String endpoint, String apiVersion, String apiKey, dev.langchain4j.model.Tokenizer tokenizer, Double temperature, Double topP, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Duration timeout, Proxy proxy, Boolean logRequests, Boolean logResponses)
Method Details
generate
public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
Specified by:
generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel
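Example (a sketch, not from the official documentation): the handler can be supplied as an anonymous StreamingResponseHandler, where onNext receives each streamed token and onComplete receives the aggregated AiMessage wrapped in a Response. The model parameter is assumed to be an already-configured instance, for example built as sketched in the class description above.

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.output.Response;
import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiStreamingChatModel;

import java.util.List;

public class StreamingExample {

    // 'model' is assumed to be an already-configured instance (see the construction sketch above).
    static void streamJoke(AzureOpenAiStreamingChatModel model) {
        List<ChatMessage> messages = List.of(UserMessage.from("Tell me a joke"));

        model.generate(messages, new StreamingResponseHandler<AiMessage>() {
            @Override
            public void onNext(String token) {
                System.out.print(token); // invoked for each streamed token
            }

            @Override
            public void onComplete(Response<AiMessage> response) {
                System.out.println();
                System.out.println("Complete message: " + response.content().text());
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}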
generate
public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, List<dev.langchain4j.agent.tool.ToolSpecification> toolSpecifications, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
Specified by:
generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel
generate
public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.agent.tool.ToolSpecification toolSpecification, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
Specified by:
generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel
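A sketch of the single-tool variant, assuming langchain4j's ToolSpecification builder API (name, description, addParameter with JsonSchemaProperty). The tool shown ("get_weather") is hypothetical; if the model decides to call it, the resulting tool-execution request is carried in the AiMessage delivered to onComplete rather than in streamed text tokens.

import dev.langchain4j.agent.tool.JsonSchemaProperty;
import dev.langchain4j.agent.tool.ToolSpecification;
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.output.Response;
import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiStreamingChatModel;

import java.util.List;

public class ToolStreamingExample {

    static void askWithTool(AzureOpenAiStreamingChatModel model) {
        // Hypothetical weather tool, declared with langchain4j's ToolSpecification builder.
        ToolSpecification weatherTool = ToolSpecification.builder()
                .name("get_weather")
                .description("Returns the current weather for a city")
                .addParameter("city", JsonSchemaProperty.STRING)
                .build();

        List<ChatMessage> messages = List.of(UserMessage.from("What is the weather in Paris?"));

        model.generate(messages, weatherTool, new StreamingResponseHandler<AiMessage>() {
            @Override
            public void onNext(String token) {
                System.out.print(token); // plain-text tokens, if the model answers directly
            }

            @Override
            public void onComplete(Response<AiMessage> response) {
                // The AiMessage may contain a tool execution request instead of text.
                System.out.println(response.content());
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}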
estimateTokenCount
public int estimateTokenCount(List<dev.langchain4j.data.message.ChatMessage> messages)
Specified by:
estimateTokenCount in interface dev.langchain4j.model.chat.TokenCountEstimator
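For example (a sketch; it presumes a tokenizer was supplied at construction time, as in the construction sketch above):

import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;
import io.quarkiverse.langchain4j.azure.openai.AzureOpenAiStreamingChatModel;

import java.util.List;

public class TokenCountExample {

    static int countTokens(AzureOpenAiStreamingChatModel model) {
        List<ChatMessage> messages = List.of(UserMessage.from("How many tokens is this message?"));
        // Estimates the prompt size, presumably using the tokenizer passed to the constructor.
        return model.estimateTokenCount(messages);
    }
}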
builder