Class AzureOpenAiStreamingChatModel
- All Implemented Interfaces:
dev.langchain4j.model.chat.StreamingChatLanguageModel, dev.langchain4j.model.chat.TokenCountEstimator
The model's response is streamed token by token and should be handled with a StreamingResponseHandler.
Mandatory parameters for initialization are endpoint and apiKey (or an alternate authentication method; see below for more information). Optionally, you can set serviceVersion (if not set, the latest version is used) and deploymentName (if not set, a default name is used). You can also provide your own OpenAIClient instance if you need more flexibility.
There are 3 authentication methods:
1. Azure OpenAI API key authentication: this is the most common method, using an Azure OpenAI API key. You need to provide the API key as a parameter, using the apiKey() method in the Builder or the apiKey parameter in the constructor. For example: `builder.apiKey("{key}")`.
2. Non-Azure OpenAI API key authentication: this method lets you use the OpenAI service instead of Azure OpenAI. You can use the nonAzureApiKey() method in the Builder, which also automatically sets the endpoint to https://api.openai.com/v1. For example: `builder.nonAzureApiKey("{key}")`. The constructor instead requires a KeyCredential instance, which can be created with `new AzureKeyCredential("{key}")`, and does not set the endpoint.
3. Azure OpenAI client with Microsoft Entra ID (formerly Azure Active Directory) credentials: this requires adding the `com.azure:azure-identity` dependency to your project (it is an optional dependency of this library). You need to provide a TokenCredential instance, using the tokenCredential() method in the Builder or the tokenCredential parameter in the constructor. For example, DefaultAzureCredential can be used to authenticate the client: set the client ID, tenant ID, and client secret of the Entra ID application as the environment variables AZURE_CLIENT_ID, AZURE_TENANT_ID, and AZURE_CLIENT_SECRET, then provide the DefaultAzureCredential instance to the builder: `builder.tokenCredential(new DefaultAzureCredentialBuilder().build())`. A builder sketch covering all three authentication methods follows this list.
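As an illustration only, here is a minimal builder sketch for the three authentication methods; the endpoint, deployment name, and key values are placeholders, and the model name in the non-Azure example is an arbitrary assumption:

```java
import com.azure.identity.DefaultAzureCredentialBuilder;
import dev.langchain4j.model.azure.AzureOpenAiStreamingChatModel;

// 1. Azure OpenAI API key authentication (endpoint, deployment name and key are placeholders)
AzureOpenAiStreamingChatModel model = AzureOpenAiStreamingChatModel.builder()
        .endpoint("https://{resource}.openai.azure.com/")
        .deploymentName("{deployment}")
        .apiKey("{key}")
        .build();

// 2. Non-Azure OpenAI API key authentication
//    (nonAzureApiKey() also sets the endpoint to https://api.openai.com/v1)
AzureOpenAiStreamingChatModel nonAzureModel = AzureOpenAiStreamingChatModel.builder()
        .deploymentName("gpt-4o-mini") // model name is a placeholder assumption
        .nonAzureApiKey("{key}")
        .build();

// 3. Microsoft Entra ID authentication (requires the optional com.azure:azure-identity dependency;
//    AZURE_CLIENT_ID, AZURE_TENANT_ID and AZURE_CLIENT_SECRET must be set as environment variables)
AzureOpenAiStreamingChatModel entraIdModel = AzureOpenAiStreamingChatModel.builder()
        .endpoint("https://{resource}.openai.azure.com/")
        .deploymentName("{deployment}")
        .tokenCredential(new DefaultAzureCredentialBuilder().build())
        .build();
```

The import of dev.langchain4j.model.azure.AzureOpenAiStreamingChatModel assumes the langchain4j-azure-open-ai module; adjust it if your packaging differs.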
-
Nested Class Summary
Nested Classes
- static class AzureOpenAiStreamingChatModel.Builder
-
Constructor Summary
Constructors
- AzureOpenAiStreamingChatModel(com.azure.ai.openai.OpenAIClient client, com.azure.ai.openai.OpenAIAsyncClient asyncClient, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, Set<dev.langchain4j.model.chat.Capability> capabilities)
- AzureOpenAiStreamingChatModel(String endpoint, String serviceVersion, com.azure.core.credential.KeyCredential keyCredential, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, boolean useAsyncClient, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<dev.langchain4j.model.chat.Capability> capabilities)
- AzureOpenAiStreamingChatModel(String endpoint, String serviceVersion, com.azure.core.credential.TokenCredential tokenCredential, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, boolean useAsyncClient, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<dev.langchain4j.model.chat.Capability> capabilities)
- AzureOpenAiStreamingChatModel(String endpoint, String serviceVersion, String apiKey, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, boolean useAsyncClient, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<dev.langchain4j.model.chat.Capability> capabilities)
-
Method Summary
Methods
- builder()
- void chat(dev.langchain4j.model.chat.request.ChatRequest request, dev.langchain4j.model.chat.response.StreamingChatResponseHandler handler)
- int estimateTokenCount(List<dev.langchain4j.data.message.ChatMessage> messages)
- void generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.agent.tool.ToolSpecification toolSpecification, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
- void generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
- void generate(List<dev.langchain4j.data.message.ChatMessage> messages, List<dev.langchain4j.agent.tool.ToolSpecification> toolSpecifications, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface dev.langchain4j.model.chat.StreamingChatLanguageModel:
chat, chat, defaultRequestParameters, doChat, generate, generate, listeners, supportedCapabilities
Methods inherited from interface dev.langchain4j.model.chat.TokenCountEstimator:
estimateTokenCount, estimateTokenCount, estimateTokenCount, estimateTokenCount
-
Constructor Details
-
AzureOpenAiStreamingChatModel
public AzureOpenAiStreamingChatModel(com.azure.ai.openai.OpenAIClient client, com.azure.ai.openai.OpenAIAsyncClient asyncClient, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, @Deprecated com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, Set<dev.langchain4j.model.chat.Capability> capabilities) -
AzureOpenAiStreamingChatModel
public AzureOpenAiStreamingChatModel(String endpoint, String serviceVersion, String apiKey, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, @Deprecated com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, boolean useAsyncClient, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<dev.langchain4j.model.chat.Capability> capabilities) -
AzureOpenAiStreamingChatModel
public AzureOpenAiStreamingChatModel(String endpoint, String serviceVersion, com.azure.core.credential.KeyCredential keyCredential, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, @Deprecated com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, boolean useAsyncClient, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<dev.langchain4j.model.chat.Capability> capabilities) -
AzureOpenAiStreamingChatModel
public AzureOpenAiStreamingChatModel(String endpoint, String serviceVersion, com.azure.core.credential.TokenCredential tokenCredential, String deploymentName, dev.langchain4j.model.Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, @Deprecated com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, boolean useAsyncClient, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<dev.langchain4j.model.chat.Capability> capabilities)
-
-
Method Details
-
generate
public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
- Specified by:
generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel
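For illustration, a hedged usage sketch of this method, assuming `model` is the AzureOpenAiStreamingChatModel instance built in the authentication sketch above and using the onNext/onComplete/onError callbacks of StreamingResponseHandler:

```java
import java.util.List;

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.output.Response;

List<ChatMessage> messages = List.of(UserMessage.from("Write a haiku about streaming."));

model.generate(messages, new StreamingResponseHandler<AiMessage>() {

    @Override
    public void onNext(String token) {
        System.out.print(token); // each partial token as it is streamed
    }

    @Override
    public void onComplete(Response<AiMessage> response) {
        System.out.println(); // the complete AiMessage is available via response.content()
    }

    @Override
    public void onError(Throwable error) {
        error.printStackTrace();
    }
});
```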
-
generate
public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, List<dev.langchain4j.agent.tool.ToolSpecification> toolSpecifications, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
- Specified by:
generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel
-
generate
public void generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.agent.tool.ToolSpecification toolSpecification, dev.langchain4j.model.StreamingResponseHandler<dev.langchain4j.data.message.AiMessage> handler)
- Specified by:
generate in interface dev.langchain4j.model.chat.StreamingChatLanguageModel
-
chat
public void chat(dev.langchain4j.model.chat.request.ChatRequest request, dev.langchain4j.model.chat.response.StreamingChatResponseHandler handler)
- Specified by:
chat in interface dev.langchain4j.model.chat.StreamingChatLanguageModel
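Again as a sketch only, assuming the onPartialResponse/onCompleteResponse/onError callbacks of StreamingChatResponseHandler and the same `model` instance as above:

```java
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

ChatRequest request = ChatRequest.builder()
        .messages(UserMessage.from("Summarize the Azure OpenAI service in one sentence."))
        .build();

model.chat(request, new StreamingChatResponseHandler() {

    @Override
    public void onPartialResponse(String partialResponse) {
        System.out.print(partialResponse); // streamed chunk of the answer
    }

    @Override
    public void onCompleteResponse(ChatResponse completeResponse) {
        System.out.println("\n" + completeResponse.aiMessage().text());
    }

    @Override
    public void onError(Throwable error) {
        error.printStackTrace();
    }
});
```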
-
estimateTokenCount
public int estimateTokenCount(List<dev.langchain4j.data.message.ChatMessage> messages)
- Specified by:
estimateTokenCount in interface dev.langchain4j.model.chat.TokenCountEstimator
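A small usage sketch, reusing the `model` and `messages` variables from the generate example above:

```java
// Estimated number of tokens the given messages would consume, based on the configured tokenizer
int estimatedTokens = model.estimateTokenCount(messages);
System.out.println("Estimated prompt tokens: " + estimatedTokens);
```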
-
builder
public static AzureOpenAiStreamingChatModel.Builder builder()
-