Class AzureOpenAiStreamingChatModel

java.lang.Object
io.quarkiverse.langchain4j.azure.openai.AzureOpenAiStreamingChatModel
All Implemented Interfaces:
dev.langchain4j.model.chat.StreamingChatLanguageModel, dev.langchain4j.model.chat.TokenCountEstimator

public class AzureOpenAiStreamingChatModel extends Object implements dev.langchain4j.model.chat.StreamingChatLanguageModel, dev.langchain4j.model.chat.TokenCountEstimator
Represents an OpenAI language model, hosted on Azure, that exposes a chat completion interface, such as gpt-3.5-turbo. The model's response is streamed token by token and should be handled with a StreamingChatResponseHandler.

Mandatory parameters for initialization are: apiVersion, apiKey, and either endpoint OR resourceName and deploymentName.
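
A minimal initialization sketch, assuming the builder exposes setter methods named after the parameters listed above (apiKey, apiVersion, endpoint, resourceName, deploymentName); the exact builder method names, the endpoint URL shape, and the api-version value are assumptions, not confirmed by this page:

  // Variant 1: configure the endpoint URL directly (builder method names are assumed).
  AzureOpenAiStreamingChatModel model = AzureOpenAiStreamingChatModel.builder()
          .apiKey(System.getenv("AZURE_OPENAI_API_KEY"))
          .apiVersion("2024-02-15-preview")   // placeholder api-version value
          .endpoint("https://my-resource.openai.azure.com/openai/deployments/my-deployment")   // assumed URL shape
          .build();

  // Variant 2 (assumed alternative): supply resourceName and deploymentName instead of endpoint.
  // AzureOpenAiStreamingChatModel model = AzureOpenAiStreamingChatModel.builder()
  //         .apiKey(System.getenv("AZURE_OPENAI_API_KEY"))
  //         .apiVersion("2024-02-15-preview")
  //         .resourceName("my-resource")
  //         .deploymentName("my-deployment")
  //         .build();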

There are two primary authentication methods to access Azure OpenAI:

1. API Key Authentication: For this type of authentication, HTTP requests must include the API Key in the "api-key" HTTP header.

2. Azure Active Directory Authentication: For this type of authentication, HTTP requests must include the authentication/access token in the "Authorization" HTTP header.

Please note that, currently, only API Key authentication is supported by this class; the second authentication option will be supported later. Both header styles are sketched below for reference.
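
For illustration only, a sketch of what the two header styles described above look like on the wire, using java.net.http. The request URL shape, api-version value, and request body are assumptions made for the example; this class sends the "api-key" variant on the caller's behalf.

  import java.net.URI;
  import java.net.http.HttpRequest;

  String url = "https://my-resource.openai.azure.com/openai/deployments/my-deployment/chat/completions?api-version=2024-02-15-preview"; // assumed URL shape
  String body = "{\"messages\":[{\"role\":\"user\",\"content\":\"Hello\"}]}";               // assumed request body

  // 1. API Key authentication: the key travels in the "api-key" HTTP header.
  HttpRequest apiKeyRequest = HttpRequest.newBuilder()
          .uri(URI.create(url))
          .header("api-key", "<your-api-key>")
          .POST(HttpRequest.BodyPublishers.ofString(body))
          .build();

  // 2. Azure Active Directory authentication: an access token travels in the "Authorization"
  //    HTTP header (not yet supported by this class).
  String accessToken = "<token-obtained-from-azure-ad>"; // assumption: acquired elsewhere
  HttpRequest aadRequest = HttpRequest.newBuilder()
          .uri(URI.create(url))
          .header("Authorization", "Bearer " + accessToken)
          .POST(HttpRequest.BodyPublishers.ofString(body))
          .build();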

  • Constructor Details

  • Method Details

    • doChat

      public void doChat(dev.langchain4j.model.chat.request.ChatRequest chatRequest, dev.langchain4j.model.chat.response.StreamingChatResponseHandler handler)
      Specified by:
      doChat in interface dev.langchain4j.model.chat.StreamingChatLanguageModel
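
      A minimal usage sketch, assuming model is an already-built AzureOpenAiStreamingChatModel and that the upstream ChatRequest builder and StreamingChatResponseHandler callbacks (onPartialResponse, onCompleteResponse, onError) match current langchain4j releases; treat the exact callback names as an assumption if your version differs:

      import dev.langchain4j.data.message.UserMessage;
      import dev.langchain4j.model.chat.request.ChatRequest;
      import dev.langchain4j.model.chat.response.ChatResponse;
      import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

      ChatRequest request = ChatRequest.builder()
              .messages(UserMessage.from("Write a haiku about Quarkus"))
              .build();

      model.doChat(request, new StreamingChatResponseHandler() {
          @Override
          public void onPartialResponse(String partialResponse) {
              System.out.print(partialResponse); // each streamed token/fragment as it arrives
          }

          @Override
          public void onCompleteResponse(ChatResponse completeResponse) {
              System.out.println(); // streaming finished; completeResponse.aiMessage() holds the final text
          }

          @Override
          public void onError(Throwable error) {
              error.printStackTrace();
          }
      });
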
    • estimateTokenCount

      public int estimateTokenCount(List<dev.langchain4j.data.message.ChatMessage> messages)
      Specified by:
      estimateTokenCount in interface dev.langchain4j.model.chat.TokenCountEstimator
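
      A minimal usage sketch, assuming model is an already-built AzureOpenAiStreamingChatModel; the method returns an estimate of the token count for the given list of messages:

      import java.util.List;
      import dev.langchain4j.data.message.ChatMessage;
      import dev.langchain4j.data.message.UserMessage;

      List<ChatMessage> messages = List.of(UserMessage.from("How many tokens will this prompt use?"));
      int tokens = model.estimateTokenCount(messages);
      System.out.println("Estimated tokens: " + tokens);
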
    • builder

      public static AzureOpenAiStreamingChatModel.Builder builder()