Package dev.langchain4j.model.openai
Class OpenAiChatModel.OpenAiChatModelBuilder
java.lang.Object
dev.langchain4j.model.openai.OpenAiChatModel.OpenAiChatModelBuilder
Enclosing class: OpenAiChatModel
Constructor Summary
Constructors:
- OpenAiChatModelBuilder()
Method Summary
All configuration methods return OpenAiChatModel.OpenAiChatModelBuilder for chaining; build() returns the configured OpenAiChatModel.

- build()
- customHeaders(Map<String, String> customHeaders)
- defaultRequestParameters(dev.langchain4j.model.chat.request.ChatRequestParameters parameters): sets default common ChatRequestParameters or OpenAI-specific OpenAiChatRequestParameters.
- frequencyPenalty(Double frequencyPenalty)
- httpClientBuilder(HttpClientBuilder httpClientBuilder)
- logger(org.slf4j.Logger logger)
- logRequests(Boolean logRequests)
- logResponses(Boolean logResponses)
- maxCompletionTokens(Integer maxCompletionTokens)
- maxRetries(Integer maxRetries)
- modelName(OpenAiChatModelName modelName)
- organizationId(String organizationId)
- parallelToolCalls(Boolean parallelToolCalls)
- presencePenalty(Double presencePenalty)
- responseFormat(dev.langchain4j.model.chat.request.ResponseFormat responseFormat)
- responseFormat(String responseFormat)
- returnThinking(Boolean returnThinking): this setting is intended for DeepSeek.
- serviceTier(String serviceTier)
- strictJsonSchema(Boolean strictJsonSchema)
- strictTools(Boolean strictTools)
- supportedCapabilities(dev.langchain4j.model.chat.Capability... supportedCapabilities)
- supportedCapabilities(Set<dev.langchain4j.model.chat.Capability> supportedCapabilities)
- temperature(Double temperature)
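Taken together, the builder is used fluently. The following is a minimal usage sketch; the API key and model name are placeholders, not values from this page, and only setters listed in the summary above are used:

```java
import dev.langchain4j.model.openai.OpenAiChatModel;

public class BuilderSketch {

    // A minimal configuration sketch. Every setter used here appears in the
    // Method Summary; the key and model name are illustrative placeholders.
    static OpenAiChatModel buildModel() {
        return OpenAiChatModel.builder()
                .apiKey("demo-key")          // placeholder; supply a real key
                .modelName("gpt-4o-mini")    // illustrative model name
                .temperature(0.2)
                .maxRetries(3)
                .logRequests(true)
                .logResponses(true)
                .build();
    }

    public static void main(String[] args) {
        // build() only assembles the model object; no chat request is sent
        // until the resulting model is invoked.
        OpenAiChatModel model = buildModel();
        System.out.println(model != null ? "built" : "null");
    }
}
```

Constructing the model does not contact the API, so the sketch runs offline; requests are sent only when the model is actually invoked.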
Constructor Details
OpenAiChatModelBuilder
public OpenAiChatModelBuilder()
Method Details
httpClientBuilder
public OpenAiChatModel.OpenAiChatModelBuilder httpClientBuilder(HttpClientBuilder httpClientBuilder)
defaultRequestParameters
public OpenAiChatModel.OpenAiChatModelBuilder defaultRequestParameters(dev.langchain4j.model.chat.request.ChatRequestParameters parameters)
Sets default common ChatRequestParameters or OpenAI-specific OpenAiChatRequestParameters.
When a parameter is set via an individual builder method (e.g., modelName(String)), its value takes precedence over the same parameter set via ChatRequestParameters.
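The precedence rule above can be sketched as follows (assuming OpenAiChatRequestParameters exposes the usual builder convention; the key, model name, and temperature values are illustrative):

```java
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.model.openai.OpenAiChatRequestParameters;

public class PrecedenceSketch {
    public static void main(String[] args) {
        // Defaults supplied as OpenAI-specific request parameters.
        OpenAiChatRequestParameters defaults = OpenAiChatRequestParameters.builder()
                .modelName("gpt-4o-mini") // illustrative
                .temperature(0.7)
                .build();

        OpenAiChatModel model = OpenAiChatModel.builder()
                .apiKey("demo-key")                 // placeholder
                .defaultRequestParameters(defaults)
                .temperature(0.2)                   // individual setter wins over the default 0.7
                .build();

        System.out.println("individual setter takes precedence over defaults");
    }
}
```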
The following setters carry no additional description in this Javadoc:
- modelName (two overloads: String and OpenAiChatModelName)
- baseUrl
- apiKey
- organizationId
- projectId
- temperature
- topP
- stop
- maxTokens
- maxCompletionTokens
- presencePenalty
- frequencyPenalty
- logitBias
responseFormat
public OpenAiChatModel.OpenAiChatModelBuilder responseFormat(dev.langchain4j.model.chat.request.ResponseFormat responseFormat)

responseFormat
public OpenAiChatModel.OpenAiChatModelBuilder responseFormat(String responseFormat)
supportedCapabilities
public OpenAiChatModel.OpenAiChatModelBuilder supportedCapabilities(Set<dev.langchain4j.model.chat.Capability> supportedCapabilities)
supportedCapabilities
public OpenAiChatModel.OpenAiChatModelBuilder supportedCapabilities(dev.langchain4j.model.chat.Capability... supportedCapabilities)
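supportedCapabilities and strictJsonSchema are commonly combined when requesting schema-constrained JSON output. A sketch, assuming the RESPONSE_FORMAT_JSON_SCHEMA constant from dev.langchain4j.model.chat.Capability (the key and model name are placeholders):

```java
import dev.langchain4j.model.chat.Capability;
import dev.langchain4j.model.openai.OpenAiChatModel;

public class JsonSchemaSketch {
    public static void main(String[] args) {
        OpenAiChatModel model = OpenAiChatModel.builder()
                .apiKey("demo-key")        // placeholder
                .modelName("gpt-4o-mini")  // illustrative
                // Advertise JSON-schema structured-output support to callers
                // that inspect the model's capabilities.
                .supportedCapabilities(Capability.RESPONSE_FORMAT_JSON_SCHEMA)
                .strictJsonSchema(true)    // request strict schema enforcement
                .build();

        System.out.println("configured");
    }
}
```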
The following setters carry no additional description in this Javadoc:
- strictJsonSchema
- seed
- user
- strictTools
- parallelToolCalls
- store
- metadata
- serviceTier
returnThinking
This setting is intended for DeepSeek. Controls whether to return thinking/reasoning text (if available) inside AiMessage.thinking(). Please note that this does not enable thinking/reasoning for the LLM; it only controls whether to parse the reasoning_content field from the API response and return it inside the AiMessage. Disabled by default. If enabled, the thinking text will be stored within the AiMessage and may be persisted.
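A sketch of wiring this up against a DeepSeek-compatible endpoint; the base URL, model name, and key below are illustrative, not taken from this page:

```java
import dev.langchain4j.model.openai.OpenAiChatModel;

public class ThinkingSketch {
    public static void main(String[] args) {
        OpenAiChatModel model = OpenAiChatModel.builder()
                .baseUrl("https://api.deepseek.com/v1") // illustrative DeepSeek-compatible endpoint
                .apiKey("demo-key")                     // placeholder
                .modelName("deepseek-reasoner")         // illustrative reasoning model
                .returnThinking(true) // parse reasoning_content into AiMessage.thinking()
                .build();

        // After a chat call, any reasoning text the API returned would be
        // available via AiMessage.thinking(); with the default (false) it is discarded.
        System.out.println("configured");
    }
}
```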
The following setters carry no additional description in this Javadoc:
- timeout
- maxRetries
- logRequests
- logResponses
logger
Parameters:
logger - an alternate Logger to be used instead of the default one provided by LangChain4j for logging requests and responses.
Returns:
this
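For example, to route request/response traffic to an application-specific SLF4J logger (the logger name, key, and model name are illustrative):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import dev.langchain4j.model.openai.OpenAiChatModel;

public class LoggerSketch {
    public static void main(String[] args) {
        // A hypothetical application logger; the name is illustrative.
        Logger llmLogger = LoggerFactory.getLogger("my.app.llm-traffic");

        OpenAiChatModel model = OpenAiChatModel.builder()
                .apiKey("demo-key")       // placeholder
                .modelName("gpt-4o-mini") // illustrative
                .logRequests(true)        // logging must still be enabled explicitly
                .logResponses(true)
                .logger(llmLogger)        // replaces the default LangChain4j logger
                .build();

        System.out.println("configured");
    }
}
```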
customHeaders
listeners
public OpenAiChatModel.OpenAiChatModelBuilder listeners(List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners)
build