Class OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder

java.lang.Object
    dev.langchain4j.model.openai.OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder

Enclosing class:
OpenAiStreamingChatModel

Constructor Summary

Constructors:
OpenAiStreamingChatModelBuilder()

Method Summary

Unless noted otherwise, each method returns OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder for chaining; build() returns the configured OpenAiStreamingChatModel.

build()
customHeaders(Map<String, String> customHeaders)
customQueryParams(Map<String, String> customQueryParams)
defaultRequestParameters(dev.langchain4j.model.chat.request.ChatRequestParameters parameters)
    Sets default common ChatRequestParameters or OpenAI-specific OpenAiChatRequestParameters.
frequencyPenalty(Double frequencyPenalty)
httpClientBuilder(HttpClientBuilder httpClientBuilder)
logger(org.slf4j.Logger logger)
logRequests(Boolean logRequests)
logResponses(Boolean logResponses)
maxCompletionTokens(Integer maxCompletionTokens)
modelName(OpenAiChatModelName modelName)
organizationId(String organizationId)
parallelToolCalls(Boolean parallelToolCalls)
presencePenalty(Double presencePenalty)
responseFormat(dev.langchain4j.model.chat.request.ResponseFormat responseFormat)
responseFormat(String responseFormat)
returnThinking(Boolean returnThinking)
    This setting is intended for DeepSeek.
serviceTier(String serviceTier)
strictJsonSchema(Boolean strictJsonSchema)
strictTools(Boolean strictTools)
temperature(Double temperature)

Constructor Details

OpenAiStreamingChatModelBuilder
public OpenAiStreamingChatModelBuilder()

Method Details

httpClientBuilder
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder httpClientBuilder(HttpClientBuilder httpClientBuilder)

defaultRequestParameters
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder defaultRequestParameters(dev.langchain4j.model.chat.request.ChatRequestParameters parameters)
Sets default common ChatRequestParameters or OpenAI-specific OpenAiChatRequestParameters.
When a parameter is set via an individual builder method (e.g., modelName(String)), its value takes precedence over the same parameter set via ChatRequestParameters.
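A minimal sketch of that precedence rule, assuming an OPENAI_API_KEY environment variable and the gpt-4o-mini model name (both are assumptions of this example, not part of the API):

// Defaults applied to every request made by this model.
OpenAiChatRequestParameters defaults = OpenAiChatRequestParameters.builder()
        .modelName("gpt-4o-mini")   // assumption: any valid OpenAI model name
        .temperature(0.2)
        .build();

OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .defaultRequestParameters(defaults)
        .temperature(0.9)           // the individual builder method wins over the 0.2 set in `defaults`
        .build();
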
modelName
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder modelName(String modelName)

modelName
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder modelName(OpenAiChatModelName modelName)

baseUrl

apiKey

organizationId
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder organizationId(String organizationId)

projectId

temperature
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder temperature(Double temperature)

topP

stop

maxTokens

maxCompletionTokens
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder maxCompletionTokens(Integer maxCompletionTokens)

presencePenalty
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder presencePenalty(Double presencePenalty)

frequencyPenalty
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder frequencyPenalty(Double frequencyPenalty)

logitBias
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder logitBias(Map<String, Integer> logitBias)

responseFormat
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder responseFormat(dev.langchain4j.model.chat.request.ResponseFormat responseFormat)

responseFormat
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder responseFormat(String responseFormat)

strictJsonSchema
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder strictJsonSchema(Boolean strictJsonSchema)
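As an illustrative sketch only: a ResponseFormat carrying a JSON schema can be combined with strictJsonSchema(true) so OpenAI enforces the schema. The JsonSchema and JsonObjectSchema builders used below come from dev.langchain4j.model.chat.request.json, and the "Person" schema and model name are assumptions of this example:

ResponseFormat responseFormat = ResponseFormat.builder()
        .type(ResponseFormatType.JSON)
        .jsonSchema(JsonSchema.builder()
                .name("Person")
                .rootElement(JsonObjectSchema.builder()
                        .addStringProperty("name")
                        .addIntegerProperty("age")
                        .build())
                .build())
        .build();

OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")        // assumption: a schema-capable model
        .responseFormat(responseFormat)  // the ResponseFormat overload above
        .strictJsonSchema(true)          // ask OpenAI to enforce the schema strictly
        .build();
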
seed

user

strictTools
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder strictTools(Boolean strictTools)

parallelToolCalls
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder parallelToolCalls(Boolean parallelToolCalls)

store

metadata
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder metadata(Map<String, String> metadata)

serviceTier
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder serviceTier(String serviceTier)

returnThinking
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder returnThinking(Boolean returnThinking)
This setting is intended for DeepSeek. Controls whether to return thinking/reasoning text (if available) inside AiMessage.thinking() and whether to invoke the StreamingChatResponseHandler.onPartialThinking(PartialThinking) callback.
Please note that this does not enable thinking/reasoning for the LLM; it only controls whether to parse the reasoning_content field from the API response and return it inside the AiMessage.
Disabled by default. If enabled, the thinking text will be stored within the AiMessage and may be persisted.
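A minimal streaming sketch using the callbacks referenced above; the DeepSeek base URL, API key variable, and model name are assumptions of this example:

OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
        .baseUrl("https://api.deepseek.com/v1")   // assumption: DeepSeek's OpenAI-compatible endpoint
        .apiKey(System.getenv("DEEPSEEK_API_KEY"))
        .modelName("deepseek-reasoner")           // assumption: a model that returns reasoning_content
        .returnThinking(true)                     // surface thinking via AiMessage.thinking() and onPartialThinking()
        .build();

model.chat("Why is the sky blue?", new StreamingChatResponseHandler() {

    @Override
    public void onPartialThinking(PartialThinking partialThinking) {
        System.out.print(partialThinking.text()); // streamed reasoning tokens
    }

    @Override
    public void onPartialResponse(String partialResponse) {
        System.out.print(partialResponse);        // streamed answer tokens
    }

    @Override
    public void onCompleteResponse(ChatResponse completeResponse) {
        // the full thinking text, if any, is available via completeResponse.aiMessage().thinking()
    }

    @Override
    public void onError(Throwable error) {
        error.printStackTrace();
    }
});
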
timeout

logRequests
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder logRequests(Boolean logRequests)

logResponses
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder logResponses(Boolean logResponses)

logger
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder logger(org.slf4j.Logger logger)
Parameters:
logger - an alternate Logger to be used instead of the default one provided by LangChain4j for logging requests and responses.
Returns:
this.
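For example (a sketch; the logger name is arbitrary), request and response logging can be routed to a specific SLF4J logger:

org.slf4j.Logger openAiLogger = org.slf4j.LoggerFactory.getLogger("my.app.openai"); // arbitrary logger name

OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .logRequests(true)      // logging must be enabled for the logger to receive anything
        .logResponses(true)
        .logger(openAiLogger)   // replaces the default LangChain4j logger
        .build();
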
customHeaders
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder customHeaders(Map<String, String> customHeaders)

customQueryParams
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder customQueryParams(Map<String, String> customQueryParams)

listeners
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder listeners(List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners)

build
public OpenAiStreamingChatModel build()
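Putting it together, a minimal self-contained sketch that builds the model and streams one reply; the model name and environment variable are assumptions of this example:

import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

import java.util.concurrent.CountDownLatch;

public class StreamingExample {

    public static void main(String[] args) throws InterruptedException {
        OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY")) // assumption: key provided via environment
                .modelName("gpt-4o-mini")                // assumption: any available chat model
                .temperature(0.3)
                .build();

        CountDownLatch done = new CountDownLatch(1);     // keep main alive until streaming finishes

        model.chat("Say hello in one sentence.", new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.print(partialResponse);       // tokens as they stream in
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                System.out.println();                    // full text also available via completeResponse.aiMessage()
                done.countDown();
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
                done.countDown();
            }
        });

        done.await();
    }
}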