Class AzureOpenAiChatModel

java.lang.Object
dev.langchain4j.model.azure.AzureOpenAiChatModel
All Implemented Interfaces:
dev.langchain4j.model.chat.ChatModel

public class AzureOpenAiChatModel extends Object implements dev.langchain4j.model.chat.ChatModel
Represents an OpenAI language model, hosted on Azure, that has a chat completion interface, such as gpt-3.5-turbo.

Mandatory parameters for initialization are endpoint and apiKey (or an alternative authentication method; see below for more information). Optionally, you can set serviceVersion (the latest version is used if unset) and deploymentName (a default name is used if unset). You can also provide your own OpenAIClient instance if you need more flexibility.

There are three authentication methods:

1. Azure OpenAI API key authentication: this is the most common method, using an Azure OpenAI API key. Provide the key via the apiKey() method on the Builder, or the apiKey parameter of the constructor. For example: `builder.apiKey("{key}")`.

2. Non-Azure OpenAI API key authentication: this method lets you use the OpenAI service directly, instead of Azure OpenAI. Use the nonAzureApiKey() method on the Builder, which also automatically sets the endpoint to "https://api.openai.com/v1". For example: `builder.nonAzureApiKey("{key}")`. The constructor instead requires a KeyCredential instance, which can be created with `new AzureKeyCredential("{key}")`, and does not set the endpoint.

3. Azure OpenAI client with Microsoft Entra ID (formerly Azure Active Directory) credentials: this requires adding the `com.azure:azure-identity` dependency to your project, which is an optional dependency of this library. Provide a TokenCredential instance via the tokenCredential() method on the Builder, or the tokenCredential parameter of the constructor. For example, DefaultAzureCredential can be used to authenticate the client: set the client ID, tenant ID, and client secret of the Entra ID application as the environment variables AZURE_CLIENT_ID, AZURE_TENANT_ID, and AZURE_CLIENT_SECRET, then provide the DefaultAzureCredential instance to the builder: `builder.tokenCredential(new DefaultAzureCredentialBuilder().build())`.
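The three authentication methods above can be sketched with the builder. This is a minimal sketch, not a definitive configuration: the endpoint, `{key}`, and deployment name are placeholders you must replace with your own values.

```java
import com.azure.identity.DefaultAzureCredentialBuilder;
import dev.langchain4j.model.azure.AzureOpenAiChatModel;

public class AuthExamples {
    public static void main(String[] args) {
        // 1. Azure OpenAI API key authentication
        AzureOpenAiChatModel keyAuth = AzureOpenAiChatModel.builder()
                .endpoint("https://{resource}.openai.azure.com/")
                .apiKey("{key}")
                .deploymentName("gpt-4o")
                .build();

        // 2. Non-Azure OpenAI API key authentication
        //    (the endpoint is set automatically to https://api.openai.com/v1)
        AzureOpenAiChatModel openAiAuth = AzureOpenAiChatModel.builder()
                .nonAzureApiKey("{key}")
                .deploymentName("gpt-4o")
                .build();

        // 3. Microsoft Entra ID authentication
        //    (requires the optional com.azure:azure-identity dependency and the
        //    AZURE_CLIENT_ID / AZURE_TENANT_ID / AZURE_CLIENT_SECRET environment variables)
        AzureOpenAiChatModel entraAuth = AzureOpenAiChatModel.builder()
                .endpoint("https://{resource}.openai.azure.com/")
                .tokenCredential(new DefaultAzureCredentialBuilder().build())
                .deploymentName("gpt-4o")
                .build();
    }
}
```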

  • Nested Class Summary

    Nested Classes
    Modifier and Type
    Class
    Description
    static class 
    AzureOpenAiChatModel.Builder
     
  • Constructor Summary

    Constructors
    Constructor
    Description
    AzureOpenAiChatModel(com.azure.ai.openai.OpenAIClient client, String deploymentName, Integer maxTokens, Double temperature, Double topP, Map<String,Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, Set<dev.langchain4j.model.chat.Capability> capabilities)
     
    AzureOpenAiChatModel(String endpoint, String serviceVersion, com.azure.core.credential.KeyCredential keyCredential, com.azure.core.http.HttpClientProvider httpClientProvider, String deploymentName, Integer maxTokens, Double temperature, Double topP, Map<String,Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, String userAgentSuffix, Map<String,String> customHeaders, Set<dev.langchain4j.model.chat.Capability> capabilities)
     
    AzureOpenAiChatModel(String endpoint, String serviceVersion, com.azure.core.credential.TokenCredential tokenCredential, com.azure.core.http.HttpClientProvider httpClientProvider, String deploymentName, Integer maxTokens, Double temperature, Double topP, Map<String,Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, String userAgentSuffix, Map<String,String> customHeaders, Set<dev.langchain4j.model.chat.Capability> capabilities)
     
    AzureOpenAiChatModel(String endpoint, String serviceVersion, String apiKey, com.azure.core.http.HttpClientProvider httpClientProvider, String deploymentName, Integer maxTokens, Double temperature, Double topP, Map<String,Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, String userAgentSuffix, Map<String,String> customHeaders, Set<dev.langchain4j.model.chat.Capability> capabilities)
     
  • Method Summary

    Modifier and Type
    Method
    Description
     
    dev.langchain4j.model.chat.response.ChatResponse
    chat(dev.langchain4j.model.chat.request.ChatRequest request)
     
    List<dev.langchain4j.model.chat.listener.ChatModelListener>
    listeners()
     
    dev.langchain4j.model.ModelProvider
    provider()
     
    Set<dev.langchain4j.model.chat.Capability>
    supportedCapabilities()

    Methods inherited from class java.lang.Object

    clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

    Methods inherited from interface dev.langchain4j.model.chat.ChatModel

    chat, chat, chat, defaultRequestParameters, doChat
  • Constructor Details

    • AzureOpenAiChatModel

      public AzureOpenAiChatModel(com.azure.ai.openai.OpenAIClient client, String deploymentName, Integer maxTokens, Double temperature, Double topP, Map<String,Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, @Deprecated com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, Set<dev.langchain4j.model.chat.Capability> capabilities)
    • AzureOpenAiChatModel

      public AzureOpenAiChatModel(String endpoint, String serviceVersion, String apiKey, com.azure.core.http.HttpClientProvider httpClientProvider, String deploymentName, Integer maxTokens, Double temperature, Double topP, Map<String,Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, @Deprecated com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, String userAgentSuffix, Map<String,String> customHeaders, Set<dev.langchain4j.model.chat.Capability> capabilities)
    • AzureOpenAiChatModel

      public AzureOpenAiChatModel(String endpoint, String serviceVersion, com.azure.core.credential.KeyCredential keyCredential, com.azure.core.http.HttpClientProvider httpClientProvider, String deploymentName, Integer maxTokens, Double temperature, Double topP, Map<String,Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, @Deprecated com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, String userAgentSuffix, Map<String,String> customHeaders, Set<dev.langchain4j.model.chat.Capability> capabilities)
    • AzureOpenAiChatModel

      public AzureOpenAiChatModel(String endpoint, String serviceVersion, com.azure.core.credential.TokenCredential tokenCredential, com.azure.core.http.HttpClientProvider httpClientProvider, String deploymentName, Integer maxTokens, Double temperature, Double topP, Map<String,Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, @Deprecated com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, dev.langchain4j.model.chat.request.ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners, String userAgentSuffix, Map<String,String> customHeaders, Set<dev.langchain4j.model.chat.Capability> capabilities)
  • Method Details

    • supportedCapabilities

      public Set<dev.langchain4j.model.chat.Capability> supportedCapabilities()
      Specified by:
      supportedCapabilities in interface dev.langchain4j.model.chat.ChatModel
    • chat

      public dev.langchain4j.model.chat.response.ChatResponse chat(dev.langchain4j.model.chat.request.ChatRequest request)
      Specified by:
      chat in interface dev.langchain4j.model.chat.ChatModel
    • listeners

      public List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners()
      Specified by:
      listeners in interface dev.langchain4j.model.chat.ChatModel
    • provider

      public dev.langchain4j.model.ModelProvider provider()
      Specified by:
      provider in interface dev.langchain4j.model.chat.ChatModel
    • builder

      public static AzureOpenAiChatModel.Builder builder()
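Putting the builder and the chat(ChatRequest) method together, a typical usage sketch looks like the following. The environment variable names, deployment name, and prompt are illustrative assumptions, not part of this API.

```java
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.azure.AzureOpenAiChatModel;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;

public class ChatExample {
    public static void main(String[] args) {
        // Build the model; temperature() is an optional tuning knob
        ChatModel model = AzureOpenAiChatModel.builder()
                .endpoint(System.getenv("AZURE_OPENAI_ENDPOINT"))
                .apiKey(System.getenv("AZURE_OPENAI_KEY"))
                .deploymentName("gpt-4o")
                .temperature(0.2)
                .build();

        // The convenience chat(String) inherited from the ChatModel interface
        String answer = model.chat("What is the capital of France?");
        System.out.println(answer);

        // The full ChatRequest/ChatResponse form implemented by this class
        ChatResponse response = model.chat(ChatRequest.builder()
                .messages(UserMessage.from("What is the capital of France?"))
                .build());
        System.out.println(response.aiMessage().text());
    }
}
```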