Class ChatCompletionCreateParams.Builder
-
public final class ChatCompletionCreateParams.Builder
A builder for ChatCompletionCreateParams.
-
-
Method Summary
All methods return final ChatCompletionCreateParams.Builder, except build(), which returns the finished ChatCompletionCreateParams.

messages(List<ChatCompletionMessageParam> messages): A list of messages comprising the conversation so far.
messages(JsonField<List<ChatCompletionMessageParam>> messages): A list of messages comprising the conversation so far.
addMessage(ChatCompletionMessageParam message): A list of messages comprising the conversation so far.
addMessage(ChatCompletionDeveloperMessageParam developer): Developer-provided instructions that the model should follow, regardless of messages sent by the user.
addMessage(ChatCompletionSystemMessageParam system): Developer-provided instructions that the model should follow, regardless of messages sent by the user.
addMessage(ChatCompletionUserMessageParam user): Messages sent by an end user, containing prompts or additional context information.
addMessage(ChatCompletionAssistantMessageParam assistant): Messages sent by the model in response to user messages.
addMessage(ChatCompletionMessage assistant): Messages sent by the model in response to user messages.
addMessage(ChatCompletionToolMessageParam tool): A list of messages comprising the conversation so far.
addMessage(ChatCompletionFunctionMessageParam function): A list of messages comprising the conversation so far.
addDeveloperMessage(ChatCompletionDeveloperMessageParam.Content content): Developer-provided instructions that the model should follow, regardless of messages sent by the user.
addDeveloperMessage(String text): The contents of the developer message.
addDeveloperMessageOfArrayOfContentParts(List<ChatCompletionContentPartText> arrayOfContentParts): An array of content parts with a defined type.
addSystemMessage(ChatCompletionSystemMessageParam.Content content): Developer-provided instructions that the model should follow, regardless of messages sent by the user.
addSystemMessage(String text): The contents of the system message.
addSystemMessageOfArrayOfContentParts(List<ChatCompletionContentPartText> arrayOfContentParts): An array of content parts with a defined type.
addUserMessage(ChatCompletionUserMessageParam.Content content): Messages sent by an end user, containing prompts or additional context information.
addUserMessage(String text): The text contents of the message.
addUserMessageOfArrayOfContentParts(List<ChatCompletionContentPart> arrayOfContentParts): An array of content parts with a defined type.
model(ChatModel model): ID of the model to use.
model(JsonField<ChatModel> model): ID of the model to use.
model(String value): ID of the model to use.
audio(ChatCompletionAudioParam audio): Parameters for audio output.
audio(Optional<ChatCompletionAudioParam> audio): Parameters for audio output.
audio(JsonField<ChatCompletionAudioParam> audio): Parameters for audio output.
frequencyPenalty(Double frequencyPenalty): Number between -2.0 and 2.0.
frequencyPenalty(Double frequencyPenalty): Number between -2.0 and 2.0.
frequencyPenalty(Optional<Double> frequencyPenalty): Number between -2.0 and 2.0.
frequencyPenalty(JsonField<Double> frequencyPenalty): Number between -2.0 and 2.0.
functionCall(ChatCompletionCreateParams.FunctionCall functionCall): Deprecated in favor of tool_choice.
functionCall(JsonField<ChatCompletionCreateParams.FunctionCall> functionCall): Deprecated in favor of tool_choice.
functionCall(ChatCompletionCreateParams.FunctionCall.Auto auto): none means the model will not call a function and instead generates a message.
functionCall(ChatCompletionFunctionCallOption functionCallOption): Specifying a particular function via {"name": "my_function"} forces the model to call that function.
functions(List<ChatCompletionCreateParams.Function> functions): Deprecated in favor of tools.
functions(JsonField<List<ChatCompletionCreateParams.Function>> functions): Deprecated in favor of tools.
addFunction(ChatCompletionCreateParams.Function function): Deprecated in favor of tools.
logitBias(ChatCompletionCreateParams.LogitBias logitBias): Modify the likelihood of specified tokens appearing in the completion.
logitBias(Optional<ChatCompletionCreateParams.LogitBias> logitBias): Modify the likelihood of specified tokens appearing in the completion.
logitBias(JsonField<ChatCompletionCreateParams.LogitBias> logitBias): Modify the likelihood of specified tokens appearing in the completion.
logprobs(Boolean logprobs): Whether to return log probabilities of the output tokens or not.
logprobs(Boolean logprobs): Whether to return log probabilities of the output tokens or not.
logprobs(Optional<Boolean> logprobs): Whether to return log probabilities of the output tokens or not.
logprobs(JsonField<Boolean> logprobs): Whether to return log probabilities of the output tokens or not.
maxCompletionTokens(Long maxCompletionTokens): An upper bound for the number of tokens that can be generated for a completion, including visible output tokens and reasoning tokens.
maxCompletionTokens(Long maxCompletionTokens): An upper bound for the number of tokens that can be generated for a completion, including visible output tokens and reasoning tokens.
maxCompletionTokens(Optional<Long> maxCompletionTokens): An upper bound for the number of tokens that can be generated for a completion, including visible output tokens and reasoning tokens.
maxCompletionTokens(JsonField<Long> maxCompletionTokens): An upper bound for the number of tokens that can be generated for a completion, including visible output tokens and reasoning tokens.
maxTokens(Long maxTokens): The maximum number of tokens that can be generated in the chat completion.
maxTokens(Long maxTokens): The maximum number of tokens that can be generated in the chat completion.
maxTokens(Optional<Long> maxTokens): The maximum number of tokens that can be generated in the chat completion.
maxTokens(JsonField<Long> maxTokens): The maximum number of tokens that can be generated in the chat completion.
metadata(Metadata metadata): Set of 16 key-value pairs that can be attached to an object.
metadata(Optional<Metadata> metadata): Set of 16 key-value pairs that can be attached to an object.
metadata(JsonField<Metadata> metadata): Set of 16 key-value pairs that can be attached to an object.
modalities(List<ChatCompletionModality> modalities): Output types that you would like the model to generate for this request.
modalities(Optional<List<ChatCompletionModality>> modalities): Output types that you would like the model to generate for this request.
modalities(JsonField<List<ChatCompletionModality>> modalities): Output types that you would like the model to generate for this request.
addModality(ChatCompletionModality modality): Output types that you would like the model to generate for this request.
n(Long n): How many chat completion choices to generate for each input message.
n(Long n): How many chat completion choices to generate for each input message.
n(Optional<Long> n): How many chat completion choices to generate for each input message.
n(JsonField<Long> n): How many chat completion choices to generate for each input message.
parallelToolCalls(Boolean parallelToolCalls): Whether to enable parallel function calling during tool use.
parallelToolCalls(JsonField<Boolean> parallelToolCalls): Whether to enable parallel function calling during tool use.
prediction(ChatCompletionPredictionContent prediction): Static predicted output content, such as the content of a text file that is being regenerated.
prediction(Optional<ChatCompletionPredictionContent> prediction): Static predicted output content, such as the content of a text file that is being regenerated.
prediction(JsonField<ChatCompletionPredictionContent> prediction): Static predicted output content, such as the content of a text file that is being regenerated.
presencePenalty(Double presencePenalty): Number between -2.0 and 2.0.
presencePenalty(Double presencePenalty): Number between -2.0 and 2.0.
presencePenalty(Optional<Double> presencePenalty): Number between -2.0 and 2.0.
presencePenalty(JsonField<Double> presencePenalty): Number between -2.0 and 2.0.
reasoningEffort(ChatCompletionReasoningEffort reasoningEffort): o1 and o3-mini models only. Constrains effort on reasoning for reasoning models.
reasoningEffort(Optional<ChatCompletionReasoningEffort> reasoningEffort): o1 and o3-mini models only. Constrains effort on reasoning for reasoning models.
reasoningEffort(JsonField<ChatCompletionReasoningEffort> reasoningEffort): o1 and o3-mini models only. Constrains effort on reasoning for reasoning models.
responseFormat(ChatCompletionCreateParams.ResponseFormat responseFormat): An object specifying the format that the model must output.
responseFormat(JsonField<ChatCompletionCreateParams.ResponseFormat> responseFormat): An object specifying the format that the model must output.
responseFormat(ResponseFormatText text): An object specifying the format that the model must output.
responseFormat(ResponseFormatJsonObject jsonObject): An object specifying the format that the model must output.
responseFormat(ResponseFormatJsonSchema jsonSchema): An object specifying the format that the model must output.
seed(Long seed): This feature is in Beta.
seed(Long seed): This feature is in Beta.
seed(Optional<Long> seed): This feature is in Beta.
seed(JsonField<Long> seed): This feature is in Beta.
serviceTier(ChatCompletionCreateParams.ServiceTier serviceTier): Specifies the latency tier to use for processing the request.
serviceTier(Optional<ChatCompletionCreateParams.ServiceTier> serviceTier): Specifies the latency tier to use for processing the request.
serviceTier(JsonField<ChatCompletionCreateParams.ServiceTier> serviceTier): Specifies the latency tier to use for processing the request.
stop(ChatCompletionCreateParams.Stop stop): Up to 4 sequences where the API will stop generating further tokens.
stop(JsonField<ChatCompletionCreateParams.Stop> stop): Up to 4 sequences where the API will stop generating further tokens.
stop(String string): Up to 4 sequences where the API will stop generating further tokens.
stopOfStrings(List<String> strings): Up to 4 sequences where the API will stop generating further tokens.
store(Boolean store): Whether or not to store the output of this chat completion request for use in our model distillation or evals products.
store(Boolean store): Whether or not to store the output of this chat completion request for use in our model distillation or evals products.
store(Optional<Boolean> store): Whether or not to store the output of this chat completion request for use in our model distillation or evals products.
store(JsonField<Boolean> store): Whether or not to store the output of this chat completion request for use in our model distillation or evals products.
streamOptions(ChatCompletionStreamOptions streamOptions): Options for streaming response.
streamOptions(Optional<ChatCompletionStreamOptions> streamOptions): Options for streaming response.
streamOptions(JsonField<ChatCompletionStreamOptions> streamOptions): Options for streaming response.
temperature(Double temperature): What sampling temperature to use, between 0 and 2.
temperature(Double temperature): What sampling temperature to use, between 0 and 2.
temperature(Optional<Double> temperature): What sampling temperature to use, between 0 and 2.
temperature(JsonField<Double> temperature): What sampling temperature to use, between 0 and 2.
toolChoice(ChatCompletionToolChoiceOption toolChoice): Controls which (if any) tool is called by the model.
toolChoice(JsonField<ChatCompletionToolChoiceOption> toolChoice): Controls which (if any) tool is called by the model.
toolChoice(ChatCompletionToolChoiceOption.Auto auto): none means the model will not call any tool and instead generates a message.
toolChoice(ChatCompletionNamedToolChoice namedToolChoice): Specifies a tool the model should use.
tools(List<ChatCompletionTool> tools): A list of tools the model may call.
tools(JsonField<List<ChatCompletionTool>> tools): A list of tools the model may call.
addTool(ChatCompletionTool tool): A list of tools the model may call.
topLogprobs(Long topLogprobs): An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability.
topLogprobs(Long topLogprobs): An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability.
topLogprobs(Optional<Long> topLogprobs): An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability.
topLogprobs(JsonField<Long> topLogprobs): An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability.
topP(Double topP): An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass.
topP(Double topP): An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass.
topP(Optional<Double> topP): An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass.
topP(JsonField<Double> topP): An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass.
user(String user): A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse.
user(JsonField<String> user): A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse.
additionalBodyProperties(Map<String, JsonValue> additionalBodyProperties)
putAdditionalBodyProperty(String key, JsonValue value)
putAllAdditionalBodyProperties(Map<String, JsonValue> additionalBodyProperties)
removeAdditionalBodyProperty(String key)
removeAllAdditionalBodyProperties(Set<String> keys)
additionalHeaders(Headers additionalHeaders)
additionalHeaders(Map<String, Iterable<String>> additionalHeaders)
putAdditionalHeader(String name, String value)
putAdditionalHeaders(String name, Iterable<String> values)
putAllAdditionalHeaders(Headers additionalHeaders)
putAllAdditionalHeaders(Map<String, Iterable<String>> additionalHeaders)
replaceAdditionalHeaders(String name, String value)
replaceAdditionalHeaders(String name, Iterable<String> values)
replaceAllAdditionalHeaders(Headers additionalHeaders)
replaceAllAdditionalHeaders(Map<String, Iterable<String>> additionalHeaders)
removeAdditionalHeaders(String name)
removeAllAdditionalHeaders(Set<String> names)
additionalQueryParams(QueryParams additionalQueryParams)
additionalQueryParams(Map<String, Iterable<String>> additionalQueryParams)
putAdditionalQueryParam(String key, String value)
putAdditionalQueryParams(String key, Iterable<String> values)
putAllAdditionalQueryParams(QueryParams additionalQueryParams)
putAllAdditionalQueryParams(Map<String, Iterable<String>> additionalQueryParams)
replaceAdditionalQueryParams(String key, String value)
replaceAdditionalQueryParams(String key, Iterable<String> values)
replaceAllAdditionalQueryParams(QueryParams additionalQueryParams)
replaceAllAdditionalQueryParams(Map<String, Iterable<String>> additionalQueryParams)
removeAdditionalQueryParams(String key)
removeAllAdditionalQueryParams(Set<String> keys)
build() (returns ChatCompletionCreateParams)
-
-
Method Detail
-
messages
final ChatCompletionCreateParams.Builder messages(List<ChatCompletionMessageParam> messages)
A list of messages comprising the conversation so far. Depending on the model you use, different message types (modalities) are supported, like text, images, and audio.
-
messages
final ChatCompletionCreateParams.Builder messages(JsonField<List<ChatCompletionMessageParam>> messages)
A list of messages comprising the conversation so far. Depending on the model you use, different message types (modalities) are supported, like text, images, and audio.
-
addMessage
final ChatCompletionCreateParams.Builder addMessage(ChatCompletionMessageParam message)
A list of messages comprising the conversation so far. Depending on the model you use, different message types (modalities) are supported, like text, images, and audio.
-
addMessage
final ChatCompletionCreateParams.Builder addMessage(ChatCompletionDeveloperMessageParam developer)
Developer-provided instructions that the model should follow, regardless of messages sent by the user. With o1 models and newer, developer messages replace the previous system messages.
-
addMessage
final ChatCompletionCreateParams.Builder addMessage(ChatCompletionSystemMessageParam system)
Developer-provided instructions that the model should follow, regardless of messages sent by the user. With o1 models and newer, use developer messages for this purpose instead.
-
addMessage
final ChatCompletionCreateParams.Builder addMessage(ChatCompletionUserMessageParam user)
Messages sent by an end user, containing prompts or additional context information.
-
addMessage
final ChatCompletionCreateParams.Builder addMessage(ChatCompletionAssistantMessageParam assistant)
Messages sent by the model in response to user messages.
-
addMessage
final ChatCompletionCreateParams.Builder addMessage(ChatCompletionMessage assistant)
Messages sent by the model in response to user messages.
-
addMessage
final ChatCompletionCreateParams.Builder addMessage(ChatCompletionToolMessageParam tool)
A list of messages comprising the conversation so far. Depending on the model you use, different message types (modalities) are supported, like text, images, and audio.
-
addMessage
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder addMessage(ChatCompletionFunctionMessageParam function)
A list of messages comprising the conversation so far. Depending on the model you use, different message types (modalities) are supported, like text, images, and audio.
-
addDeveloperMessage
final ChatCompletionCreateParams.Builder addDeveloperMessage(ChatCompletionDeveloperMessageParam.Content content)
Developer-provided instructions that the model should follow, regardless of messages sent by the user. With o1 models and newer, developer messages replace the previous system messages.
-
addDeveloperMessage
final ChatCompletionCreateParams.Builder addDeveloperMessage(String text)
The contents of the developer message.
-
addDeveloperMessageOfArrayOfContentParts
final ChatCompletionCreateParams.Builder addDeveloperMessageOfArrayOfContentParts(List<ChatCompletionContentPartText> arrayOfContentParts)
An array of content parts with a defined type. For developer messages, only type text is supported.
-
addSystemMessage
final ChatCompletionCreateParams.Builder addSystemMessage(ChatCompletionSystemMessageParam.Content content)
Developer-provided instructions that the model should follow, regardless of messages sent by the user. With o1 models and newer, use developer messages for this purpose instead.
-
addSystemMessage
final ChatCompletionCreateParams.Builder addSystemMessage(String text)
The contents of the system message.
-
addSystemMessageOfArrayOfContentParts
final ChatCompletionCreateParams.Builder addSystemMessageOfArrayOfContentParts(List<ChatCompletionContentPartText> arrayOfContentParts)
An array of content parts with a defined type. For system messages, only type text is supported.
-
addUserMessage
final ChatCompletionCreateParams.Builder addUserMessage(ChatCompletionUserMessageParam.Content content)
Messages sent by an end user, containing prompts or additional context information.
-
addUserMessage
final ChatCompletionCreateParams.Builder addUserMessage(String text)
The text contents of the message.
-
addUserMessageOfArrayOfContentParts
final ChatCompletionCreateParams.Builder addUserMessageOfArrayOfContentParts(List<ChatCompletionContentPart> arrayOfContentParts)
An array of content parts with a defined type. Supported options differ based on the model being used to generate the response. Can contain text, image, or audio inputs.
-
model
final ChatCompletionCreateParams.Builder model(ChatModel model)
ID of the model to use. See the model endpoint compatibility table for details on which models work with the Chat API.
-
model
final ChatCompletionCreateParams.Builder model(JsonField<ChatModel> model)
ID of the model to use. See the model endpoint compatibility table for details on which models work with the Chat API.
-
model
final ChatCompletionCreateParams.Builder model(String value)
ID of the model to use. See the model endpoint compatibility table for details on which models work with the Chat API.
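A minimal sketch of how the message and model setters above compose into request params (the import path and model ID are assumptions; adjust for your SDK version):

import com.openai.models.ChatCompletionCreateParams;

// Build request params using the convenience overloads documented on this page:
// a developer instruction, a user prompt, and a model ID given as a plain string.
ChatCompletionCreateParams params = ChatCompletionCreateParams.builder()
        .model("gpt-4o") // model(String value); any valid model ID works here
        .addDeveloperMessage("You are a terse assistant.")
        .addUserMessage("Summarize the release notes in one sentence.")
        .build();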
-
audio
final ChatCompletionCreateParams.Builder audio(ChatCompletionAudioParam audio)
Parameters for audio output. Required when audio output is requested with modalities: ["audio"]. Learn more.
-
audio
final ChatCompletionCreateParams.Builder audio(Optional<ChatCompletionAudioParam> audio)
Parameters for audio output. Required when audio output is requested with modalities: ["audio"]. Learn more.
-
audio
final ChatCompletionCreateParams.Builder audio(JsonField<ChatCompletionAudioParam> audio)
Parameters for audio output. Required when audio output is requested with modalities: ["audio"]. Learn more.
-
frequencyPenalty
final ChatCompletionCreateParams.Builder frequencyPenalty(Double frequencyPenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
-
frequencyPenalty
final ChatCompletionCreateParams.Builder frequencyPenalty(Double frequencyPenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
-
frequencyPenalty
final ChatCompletionCreateParams.Builder frequencyPenalty(Optional<Double> frequencyPenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
-
frequencyPenalty
final ChatCompletionCreateParams.Builder frequencyPenalty(JsonField<Double> frequencyPenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
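Most scalar parameters follow the overload pattern seen here: a plain value, an Optional, and a JsonField. A small sketch (assuming, as the signatures suggest, that an empty Optional leaves the field unset):

import java.util.Optional;

import com.openai.models.ChatCompletionCreateParams;

ChatCompletionCreateParams.Builder builder = ChatCompletionCreateParams.builder()
        .model("gpt-4o")
        .addUserMessage("Hello")
        .frequencyPenalty(0.5)               // plain value overload
        .presencePenalty(Optional.empty());  // Optional overload: no value sent (assumed behavior)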
-
functionCall
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder functionCall(ChatCompletionCreateParams.FunctionCall functionCall)
Deprecated in favor of tool_choice.
Controls which (if any) function is called by the model. none means the model will not call a function and instead generates a message. auto means the model can pick between generating a message or calling a function. Specifying a particular function via {"name": "my_function"} forces the model to call that function.
none is the default when no functions are present. auto is the default if functions are present.
-
functionCall
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder functionCall(JsonField<ChatCompletionCreateParams.FunctionCall> functionCall)
Deprecated in favor of tool_choice.
Controls which (if any) function is called by the model. none means the model will not call a function and instead generates a message. auto means the model can pick between generating a message or calling a function. Specifying a particular function via {"name": "my_function"} forces the model to call that function.
none is the default when no functions are present. auto is the default if functions are present.
-
functionCall
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder functionCall(ChatCompletionCreateParams.FunctionCall.Auto auto)
none means the model will not call a function and instead generates a message. auto means the model can pick between generating a message or calling a function.
-
functionCall
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder functionCall(ChatCompletionFunctionCallOption functionCallOption)
Specifying a particular function via {"name": "my_function"} forces the model to call that function.
-
functions
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder functions(List<ChatCompletionCreateParams.Function> functions)
Deprecated in favor of tools.
A list of functions the model may generate JSON inputs for.
-
functions
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder functions(JsonField<List<ChatCompletionCreateParams.Function>> functions)
Deprecated in favor of tools.
A list of functions the model may generate JSON inputs for.
-
addFunction
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder addFunction(ChatCompletionCreateParams.Function function)
Deprecated in favor of tools.
A list of functions the model may generate JSON inputs for.
-
logitBias
final ChatCompletionCreateParams.Builder logitBias(ChatCompletionCreateParams.LogitBias logitBias)
Modify the likelihood of specified tokens appearing in the completion.
Accepts a JSON object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token.
-
logitBias
final ChatCompletionCreateParams.Builder logitBias(Optional<ChatCompletionCreateParams.LogitBias> logitBias)
Modify the likelihood of specified tokens appearing in the completion.
Accepts a JSON object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token.
-
logitBias
final ChatCompletionCreateParams.Builder logitBias(JsonField<ChatCompletionCreateParams.LogitBias> logitBias)
Modify the likelihood of specified tokens appearing in the completion.
Accepts a JSON object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token.
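A hedged sketch of setting a bias; the LogitBias and JsonValue builder methods shown are assumptions about the SDK's map-style types, and the token ID is purely illustrative:

import com.openai.core.JsonValue;
import com.openai.models.ChatCompletionCreateParams;

// Map token IDs (as string keys) to bias values between -100 and 100;
// -100 effectively bans the token, 100 effectively forces it.
ChatCompletionCreateParams.LogitBias bias = ChatCompletionCreateParams.LogitBias.builder()
        .putAdditionalProperty("50256", JsonValue.from(-100)) // illustrative token ID
        .build();

ChatCompletionCreateParams params = ChatCompletionCreateParams.builder()
        .model("gpt-4o")
        .addUserMessage("Hello")
        .logitBias(bias)
        .build();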
-
logprobs
final ChatCompletionCreateParams.Builder logprobs(Boolean logprobs)
Whether to return log probabilities of the output tokens or not. If true, returns the log probabilities of each output token returned in the content of message.
-
logprobs
final ChatCompletionCreateParams.Builder logprobs(Boolean logprobs)
Whether to return log probabilities of the output tokens or not. If true, returns the log probabilities of each output token returned in the content of message.
-
logprobs
final ChatCompletionCreateParams.Builder logprobs(Optional<Boolean> logprobs)
Whether to return log probabilities of the output tokens or not. If true, returns the log probabilities of each output token returned in the content of message.
-
logprobs
final ChatCompletionCreateParams.Builder logprobs(JsonField<Boolean> logprobs)
Whether to return log probabilities of the output tokens or not. If true, returns the log probabilities of each output token returned in the content of message.
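For example, to get per-token log probabilities back (both setters are documented on this page):

import com.openai.models.ChatCompletionCreateParams;

// topLogprobs only has an effect when logprobs is set to true.
ChatCompletionCreateParams params = ChatCompletionCreateParams.builder()
        .model("gpt-4o")
        .addUserMessage("Hello")
        .logprobs(true)
        .topLogprobs(3L)
        .build();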
-
maxCompletionTokens
final ChatCompletionCreateParams.Builder maxCompletionTokens(Long maxCompletionTokens)
An upper bound for the number of tokens that can be generated for a completion, including visible output tokens and reasoning tokens.
-
maxCompletionTokens
final ChatCompletionCreateParams.Builder maxCompletionTokens(Long maxCompletionTokens)
An upper bound for the number of tokens that can be generated for a completion, including visible output tokens and reasoning tokens.
-
maxCompletionTokens
final ChatCompletionCreateParams.Builder maxCompletionTokens(Optional<Long> maxCompletionTokens)
An upper bound for the number of tokens that can be generated for a completion, including visible output tokens and reasoning tokens.
-
maxCompletionTokens
final ChatCompletionCreateParams.Builder maxCompletionTokens(JsonField<Long> maxCompletionTokens)
An upper bound for the number of tokens that can be generated for a completion, including visible output tokens and reasoning tokens.
-
maxTokens
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder maxTokens(Long maxTokens)
The maximum number of tokens that can be generated in the chat completion. This value can be used to control costs for text generated via API.
This value is now deprecated in favor of max_completion_tokens, and is not compatible with o1 series models.
-
maxTokens
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder maxTokens(Long maxTokens)
The maximum number of tokens that can be generated in the chat completion. This value can be used to control costs for text generated via API.
This value is now deprecated in favor of max_completion_tokens, and is not compatible with o1 series models.
-
maxTokens
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder maxTokens(Optional<Long> maxTokens)
The maximum number of tokens that can be generated in the chat completion. This value can be used to control costs for text generated via API.
This value is now deprecated in favor of max_completion_tokens, and is not compatible with o1 series models.
-
maxTokens
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder maxTokens(JsonField<Long> maxTokens)
The maximum number of tokens that can be generated in the chat completion. This value can be used to control costs for text generated via API.
This value is now deprecated in favor of max_completion_tokens, and is not compatible with o1 series models.
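A small sketch showing the preferred replacement; maxCompletionTokens also budgets for reasoning tokens on models that produce them:

import com.openai.models.ChatCompletionCreateParams;

// Prefer maxCompletionTokens over the deprecated maxTokens.
ChatCompletionCreateParams params = ChatCompletionCreateParams.builder()
        .model("gpt-4o")
        .addUserMessage("Hello")
        .maxCompletionTokens(256L)
        .build();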
-
metadata
final ChatCompletionCreateParams.Builder metadata(Metadata metadata)
Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard.
Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters.
-
metadata
final ChatCompletionCreateParams.Builder metadata(Optional<Metadata> metadata)
Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard.
Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters.
-
metadata
final ChatCompletionCreateParams.Builder metadata(JsonField<Metadata> metadata)
Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard.
Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters.
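A sketch of attaching metadata; the Metadata builder and JsonValue helper shown here are assumptions about the SDK's map-style types, and the key/value pair is illustrative:

import com.openai.core.JsonValue;
import com.openai.models.ChatCompletionCreateParams;
import com.openai.models.Metadata;

// Up to 16 string keys (max 64 chars) mapping to string values (max 512 chars).
Metadata metadata = Metadata.builder()
        .putAdditionalProperty("customer_id", JsonValue.from("cust_123"))
        .build();

ChatCompletionCreateParams params = ChatCompletionCreateParams.builder()
        .model("gpt-4o")
        .addUserMessage("Hello")
        .metadata(metadata)
        .build();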
-
modalities
final ChatCompletionCreateParams.Builder modalities(List<ChatCompletionModality> modalities)
Output types that you would like the model to generate for this request. Most models are capable of generating text, which is the default: ["text"]
The gpt-4o-audio-preview model can also be used to generate audio. To request that this model generate both text and audio responses, you can use: ["text", "audio"]
-
modalities
final ChatCompletionCreateParams.Builder modalities(Optional<List<ChatCompletionModality>> modalities)
Output types that you would like the model to generate for this request. Most models are capable of generating text, which is the default: ["text"]
The gpt-4o-audio-preview model can also be used to generate audio. To request that this model generate both text and audio responses, you can use: ["text", "audio"]
-
modalities
final ChatCompletionCreateParams.Builder modalities(JsonField<List<ChatCompletionModality>> modalities)
Output types that you would like the model to generate for this request. Most models are capable of generating text, which is the default: ["text"]
The gpt-4o-audio-preview model can also be used to generate audio. To request that this model generate both text and audio responses, you can use: ["text", "audio"]
-
addModality
final ChatCompletionCreateParams.Builder addModality(ChatCompletionModality modality)
Output types that you would like the model to generate for this request. Most models are capable of generating text, which is the default: ["text"]
The gpt-4o-audio-preview model can also be used to generate audio. To request that this model generate both text and audio responses, you can use: ["text", "audio"]
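A sketch of requesting both text and audio output; the ChatCompletionModality constant names are assumptions, and audio(...) must also be set as described above when audio is requested:

import com.openai.models.ChatCompletionCreateParams;
import com.openai.models.ChatCompletionModality;

// Request both output types from the audio-capable preview model.
ChatCompletionCreateParams.Builder builder = ChatCompletionCreateParams.builder()
        .model("gpt-4o-audio-preview")
        .addUserMessage("Say hello out loud.")
        .addModality(ChatCompletionModality.TEXT)   // assumed constant name
        .addModality(ChatCompletionModality.AUDIO); // assumed constant name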
-
n
final ChatCompletionCreateParams.Builder n(Long n)
How many chat completion choices to generate for each input message. Note that you will be charged based on the number of generated tokens across all of the choices. Keep n as 1 to minimize costs.
-
n
final ChatCompletionCreateParams.Builder n(Long n)
How many chat completion choices to generate for each input message. Note that you will be charged based on the number of generated tokens across all of the choices. Keep n as 1 to minimize costs.
-
n
final ChatCompletionCreateParams.Builder n(Optional<Long> n)
How many chat completion choices to generate for each input message. Note that you will be charged based on the number of generated tokens across all of the choices. Keep n as 1 to minimize costs.
-
n
final ChatCompletionCreateParams.Builder n(JsonField<Long> n)
How many chat completion choices to generate for each input message. Note that you will be charged based on the number of generated tokens across all of the choices. Keep n as 1 to minimize costs.
-
parallelToolCalls
final ChatCompletionCreateParams.Builder parallelToolCalls(Boolean parallelToolCalls)
Whether to enable parallel function calling during tool use.
-
parallelToolCalls
final ChatCompletionCreateParams.Builder parallelToolCalls(JsonField<Boolean> parallelToolCalls)
Whether to enable parallel function calling during tool use.
-
prediction
final ChatCompletionCreateParams.Builder prediction(ChatCompletionPredictionContent prediction)
Static predicted output content, such as the content of a text file that is being regenerated.
-
prediction
final ChatCompletionCreateParams.Builder prediction(Optional<ChatCompletionPredictionContent> prediction)
Static predicted output content, such as the content of a text file that is being regenerated.
-
prediction
final ChatCompletionCreateParams.Builder prediction(JsonField<ChatCompletionPredictionContent> prediction)
Static predicted output content, such as the content of a text file that is being regenerated.
-
presencePenalty
final ChatCompletionCreateParams.Builder presencePenalty(Double presencePenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
-
presencePenalty
final ChatCompletionCreateParams.Builder presencePenalty(Double presencePenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
-
presencePenalty
final ChatCompletionCreateParams.Builder presencePenalty(Optional<Double> presencePenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
-
presencePenalty
final ChatCompletionCreateParams.Builder presencePenalty(JsonField<Double> presencePenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
-
reasoningEffort
final ChatCompletionCreateParams.Builder reasoningEffort(ChatCompletionReasoningEffort reasoningEffort)
o1 and o3-mini models only
Constrains effort on reasoning for reasoning models. Currently supported values are low, medium, and high. Reducing reasoning effort can result in faster responses and fewer tokens used on reasoning in a response.
-
reasoningEffort
final ChatCompletionCreateParams.Builder reasoningEffort(Optional<ChatCompletionReasoningEffort> reasoningEffort)
o1 and o3-mini models only
Constrains effort on reasoning for reasoning models. Currently supported values are low, medium, and high. Reducing reasoning effort can result in faster responses and fewer tokens used on reasoning in a response.
-
reasoningEffort
final ChatCompletionCreateParams.Builder reasoningEffort(JsonField<ChatCompletionReasoningEffort> reasoningEffort)
o1 and o3-mini models only
Constrains effort on reasoning for reasoning models. Currently supported values are low, medium, and high. Reducing reasoning effort can result in faster responses and fewer tokens used on reasoning in a response.
-
responseFormat
final ChatCompletionCreateParams.Builder responseFormat(ChatCompletionCreateParams.ResponseFormat responseFormat)
An object specifying the format that the model must output.
Setting to { "type": "json_schema", "json_schema": {...} } enables Structured Outputs which ensures the model will match your supplied JSON schema. Learn more in the Structured Outputs guide.
Setting to { "type": "json_object" } enables JSON mode, which ensures the message the model generates is valid JSON.
Important: when using JSON mode, you must also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if finish_reason="length", which indicates the generation exceeded max_tokens or the conversation exceeded the max context length.
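A JSON-mode sketch; the ResponseFormatJsonObject builder shape is an assumption, and note the system message that explicitly asks for JSON, as required above:

import com.openai.models.ChatCompletionCreateParams;
import com.openai.models.ResponseFormatJsonObject;

// Enable JSON mode and instruct the model to emit JSON in a message.
ChatCompletionCreateParams params = ChatCompletionCreateParams.builder()
        .model("gpt-4o")
        .addSystemMessage("Reply with a single JSON object.")
        .addUserMessage("List three primary colors.")
        .responseFormat(ResponseFormatJsonObject.builder().build()) // assumed builder shape
        .build();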
-
responseFormat
final ChatCompletionCreateParams.Builder responseFormat(JsonField<ChatCompletionCreateParams.ResponseFormat> responseFormat)
An object specifying the format that the model must output.
Setting to { "type": "json_schema", "json_schema": {...} } enables Structured Outputs which ensures the model will match your supplied JSON schema. Learn more in the Structured Outputs guide.
Setting to { "type": "json_object" } enables JSON mode, which ensures the message the model generates is valid JSON.
Important: when using JSON mode, you must also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if finish_reason="length", which indicates the generation exceeded max_tokens or the conversation exceeded the max context length.
-
responseFormat
final ChatCompletionCreateParams.Builder responseFormat(ResponseFormatText text)
An object specifying the format that the model must output.
Setting to { "type": "json_schema", "json_schema": {...} } enables Structured Outputs which ensures the model will match your supplied JSON schema. Learn more in the Structured Outputs guide.
Setting to { "type": "json_object" } enables JSON mode, which ensures the message the model generates is valid JSON.
Important: when using JSON mode, you must also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if finish_reason="length", which indicates the generation exceeded max_tokens or the conversation exceeded the max context length.
-
responseFormat
final ChatCompletionCreateParams.Builder responseFormat(ResponseFormatJsonObject jsonObject)
An object specifying the format that the model must output.
Setting to { "type": "json_schema", "json_schema": {...} } enables Structured Outputs which ensures the model will match your supplied JSON schema. Learn more in the Structured Outputs guide.
Setting to { "type": "json_object" } enables JSON mode, which ensures the message the model generates is valid JSON.
Important: when using JSON mode, you must also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if finish_reason="length", which indicates the generation exceeded max_tokens or the conversation exceeded the max context length.
-
responseFormat
final ChatCompletionCreateParams.Builder responseFormat(ResponseFormatJsonSchema jsonSchema)
An object specifying the format that the model must output.
Setting to { "type": "json_schema", "json_schema": {...} } enables Structured Outputs which ensures the model will match your supplied JSON schema. Learn more in the Structured Outputs guide.
Setting to { "type": "json_object" } enables JSON mode, which ensures the message the model generates is valid JSON.
Important: when using JSON mode, you must also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if finish_reason="length", which indicates the generation exceeded max_tokens or the conversation exceeded the max context length.
-
seed
final ChatCompletionCreateParams.Builder seed(Long seed)
This feature is in Beta. If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same seed and parameters should return the same result. Determinism is not guaranteed, and you should refer to the system_fingerprint response parameter to monitor changes in the backend.
-
seed
final ChatCompletionCreateParams.Builder seed(Long seed)
This feature is in Beta. If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same seed and parameters should return the same result. Determinism is not guaranteed, and you should refer to the system_fingerprint response parameter to monitor changes in the backend.
-
seed
final ChatCompletionCreateParams.Builder seed(Optional<Long> seed)
This feature is in Beta. If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same seed and parameters should return the same result. Determinism is not guaranteed, and you should refer to the system_fingerprint response parameter to monitor changes in the backend.
-
seed
final ChatCompletionCreateParams.Builder seed(JsonField<Long> seed)
This feature is in Beta. If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same seed and parameters should return the same result. Determinism is not guaranteed, and you should refer to the system_fingerprint response parameter to monitor changes in the backend.
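For example, a best-effort reproducible request fixes the seed and keeps the other sampling parameters identical between calls:

import com.openai.models.ChatCompletionCreateParams;

// Repeat this exact request and compare the response's system_fingerprint
// to detect backend changes that may alter the output.
ChatCompletionCreateParams params = ChatCompletionCreateParams.builder()
        .model("gpt-4o")
        .addUserMessage("Pick a random fruit.")
        .seed(12345L)
        .temperature(0.0)
        .build();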
-
serviceTier
final ChatCompletionCreateParams.Builder serviceTier(ChatCompletionCreateParams.ServiceTier serviceTier)
Specifies the latency tier to use for processing the request. This parameter is relevant for customers subscribed to the scale tier service:
If set to 'auto', and the Project is Scale tier enabled, the system will utilize scale tier credits until they are exhausted.
If set to 'auto', and the Project is not Scale tier enabled, the request will be processed using the default service tier with a lower uptime SLA and no latency guarantee.
If set to 'default', the request will be processed using the default service tier with a lower uptime SLA and no latency guarantee.
When not set, the default behavior is 'auto'.
-
serviceTier
final ChatCompletionCreateParams.Builder serviceTier(Optional<ChatCompletionCreateParams.ServiceTier> serviceTier)
Specifies the latency tier to use for processing the request. This parameter is relevant for customers subscribed to the scale tier service:
If set to 'auto', and the Project is Scale tier enabled, the system will utilize scale tier credits until they are exhausted.
If set to 'auto', and the Project is not Scale tier enabled, the request will be processed using the default service tier with a lower uptime SLA and no latency guarantee.
If set to 'default', the request will be processed using the default service tier with a lower uptime SLA and no latency guarantee.
When not set, the default behavior is 'auto'.
-
serviceTier
final ChatCompletionCreateParams.Builder serviceTier(JsonField<ChatCompletionCreateParams.ServiceTier> serviceTier)
Specifies the latency tier to use for processing the request. This parameter is relevant for customers subscribed to the scale tier service:
If set to 'auto', and the Project is Scale tier enabled, the system will utilize scale tier credits until they are exhausted.
If set to 'auto', and the Project is not Scale tier enabled, the request will be processed using the default service tier with a lower uptime SLA and no latency guarantee.
If set to 'default', the request will be processed using the default service tier with a lower uptime SLA and no latency guarantee.
When not set, the default behavior is 'auto'.
-
stop
final ChatCompletionCreateParams.Builder stop(ChatCompletionCreateParams.Stop stop)
Up to 4 sequences where the API will stop generating further tokens.
-
stop
final ChatCompletionCreateParams.Builder stop(JsonField<ChatCompletionCreateParams.Stop> stop)
Up to 4 sequences where the API will stop generating further tokens.
-
stop
final ChatCompletionCreateParams.Builder stop(String string)
Up to 4 sequences where the API will stop generating further tokens.
-
stopOfStrings
final ChatCompletionCreateParams.Builder stopOfStrings(List<String> strings)
Up to 4 sequences where the API will stop generating further tokens.
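For example, to stop generation at either of two sequences using the documented overloads:

import java.util.List;

import com.openai.models.ChatCompletionCreateParams;

// Up to 4 stop sequences are allowed; generation halts before emitting one.
ChatCompletionCreateParams params = ChatCompletionCreateParams.builder()
        .model("gpt-4o")
        .addUserMessage("Write a haiku, then the word END.")
        .stopOfStrings(List.of("END", "\n\n\n"))
        .build();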
-
store
final ChatCompletionCreateParams.Builder store(Boolean store)
Whether or not to store the output of this chat completion request for use in our model distillation or evals products.
-
store
final ChatCompletionCreateParams.Builder store(Boolean store)
Whether or not to store the output of this chat completion request for use in our model distillation or evals products.
-
store
final ChatCompletionCreateParams.Builder store(Optional<Boolean> store)
Whether or not to store the output of this chat completion request for use in our model distillation or evals products.
-
store
final ChatCompletionCreateParams.Builder store(JsonField<Boolean> store)
Whether or not to store the output of this chat completion request for use in our model distillation or evals products.
-
streamOptions
final ChatCompletionCreateParams.Builder streamOptions(ChatCompletionStreamOptions streamOptions)
Options for streaming response. Only set this when you set stream: true.
-
streamOptions
final ChatCompletionCreateParams.Builder streamOptions(Optional<ChatCompletionStreamOptions> streamOptions)
Options for streaming response. Only set this when you set stream: true.
-
streamOptions
final ChatCompletionCreateParams.Builder streamOptions(JsonField<ChatCompletionStreamOptions> streamOptions)
Options for streaming response. Only set this when you set stream: true.
-
temperature
final ChatCompletionCreateParams.Builder temperature(Double temperature)
What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. We generally recommend altering this or top_p but not both.
-
temperature
final ChatCompletionCreateParams.Builder temperature(Double temperature)
What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. We generally recommend altering this or top_p but not both.
-
temperature
final ChatCompletionCreateParams.Builder temperature(Optional<Double> temperature)
What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. We generally recommend altering this or top_p but not both.
-
temperature
final ChatCompletionCreateParams.Builder temperature(JsonField<Double> temperature)
What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. We generally recommend altering this or top_p but not both.
-
toolChoice
final ChatCompletionCreateParams.Builder toolChoice(ChatCompletionToolChoiceOption toolChoice)
Controls which (if any) tool is called by the model. none means the model will not call any tool and instead generates a message. auto means the model can pick between generating a message or calling one or more tools. required means the model must call one or more tools. Specifying a particular tool via {"type": "function", "function": {"name": "my_function"}} forces the model to call that tool.
none is the default when no tools are present. auto is the default if tools are present.
-
toolChoice
final ChatCompletionCreateParams.Builder toolChoice(JsonField<ChatCompletionToolChoiceOption> toolChoice)
Controls which (if any) tool is called by the model. none means the model will not call any tool and instead generates a message. auto means the model can pick between generating a message or calling one or more tools. required means the model must call one or more tools. Specifying a particular tool via {"type": "function", "function": {"name": "my_function"}} forces the model to call that tool.
none is the default when no tools are present. auto is the default if tools are present.
-
toolChoice
final ChatCompletionCreateParams.Builder toolChoice(ChatCompletionToolChoiceOption.Auto auto)
none means the model will not call any tool and instead generates a message. auto means the model can pick between generating a message or calling one or more tools. required means the model must call one or more tools.
-
toolChoice
final ChatCompletionCreateParams.Builder toolChoice(ChatCompletionNamedToolChoice namedToolChoice)
Specifies a tool the model should use. Use to force the model to call a specific function.
-
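Below is a hedged sketch of forcing the model to call one specific function via a named tool choice; the nested ChatCompletionNamedToolChoice.Function type and its builder setters are assumptions about the SDK's shape, mirroring the JSON form {"type": "function", "function": {"name": "my_function"}}:

    // Sketch: the nested Function type and its builder are assumptions.
    ChatCompletionNamedToolChoice namedChoice = ChatCompletionNamedToolChoice.builder()
        .function(ChatCompletionNamedToolChoice.Function.builder()
            .name("my_function")
            .build())
        .build();
    builder.toolChoice(namedChoice); // forces a call to my_function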
tools
final ChatCompletionCreateParams.Builder tools(List<ChatCompletionTool> tools)
A list of tools the model may call. Currently, only functions are supported as a tool. Use this to provide a list of functions the model may generate JSON inputs for. A max of 128 functions are supported.
-
tools
final ChatCompletionCreateParams.Builder tools(JsonField<List<ChatCompletionTool>> tools)
A list of tools the model may call. Currently, only functions are supported as a tool. Use this to provide a list of functions the model may generate JSON inputs for. A max of 128 functions are supported.
-
addTool
final ChatCompletionCreateParams.Builder addTool(ChatCompletionTool tool)
A list of tools the model may call. Currently, only functions are supported as a tool. Use this to provide a list of functions the model may generate JSON inputs for. A max of 128 functions are supported.
-
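As an illustrative, hedged sketch, registering a single callable function as a tool; FunctionDefinition and its builder setters are assumptions about the SDK's model classes rather than signatures confirmed by this summary:

    // Sketch: FunctionDefinition and ChatCompletionTool.builder().function(...) are assumptions.
    ChatCompletionTool weatherTool = ChatCompletionTool.builder()
        .function(FunctionDefinition.builder()
            .name("get_weather")
            .description("Look up the current weather for a city")
            .build())
        .build();
    builder.addTool(weatherTool); // up to 128 functions may be registered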
topLogprobs
final ChatCompletionCreateParams.Builder topLogprobs(Long topLogprobs)
An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability. logprobs must be set to true if this parameter is used.
-
topLogprobs
final ChatCompletionCreateParams.Builder topLogprobs(Long topLogprobs)
An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability. logprobs must be set to true if this parameter is used.
-
topLogprobs
final ChatCompletionCreateParams.Builder topLogprobs(Optional<Long> topLogprobs)
An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability. logprobs must be set to true if this parameter is used.
-
topLogprobs
final ChatCompletionCreateParams.Builder topLogprobs(JsonField<Long> topLogprobs)
An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability. logprobs must be set to true if this parameter is used.
-
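For example, continuing the sketch, requesting the five most likely tokens at each position; the logprobs(Boolean) setter is assumed to exist on this builder, since the note above requires logprobs to be true for topLogprobs to apply:

    builder.logprobs(true)   // assumption: a logprobs(Boolean) setter is available
        .topLogprobs(5L);    // return the 5 most likely tokens per position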
topP
final ChatCompletionCreateParams.Builder topP(Double topP)
An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered.
We generally recommend altering this or temperature but not both.
-
topP
final ChatCompletionCreateParams.Builder topP(Double topP)
An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered.
We generally recommend altering this or temperature but not both.
-
topP
final ChatCompletionCreateParams.Builder topP(Optional<Double> topP)
An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered.
We generally recommend altering this or temperature but not both.
-
topP
final ChatCompletionCreateParams.Builder topP(JsonField<Double> topP)
An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered.
We generally recommend altering this or temperature but not both.
-
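Alternatively, a sketch of nucleus sampling restricted to the top 10% of probability mass; temperature is left at its default, since only one of the two should be tuned:

    builder.topP(0.1); // consider only tokens in the top 10% probability mass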
user
final ChatCompletionCreateParams.Builder user(String user)
A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. Learn more.
-
user
final ChatCompletionCreateParams.Builder user(JsonField<String> user)
A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. Learn more.
-
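A small sketch of tagging requests with a stable per-end-user identifier (the value shown is illustrative):

    builder.user("end-user-1234"); // stable identifier to aid abuse monitoring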
additionalBodyProperties
final ChatCompletionCreateParams.Builder additionalBodyProperties(Map<String, JsonValue> additionalBodyProperties)
-
putAdditionalBodyProperty
final ChatCompletionCreateParams.Builder putAdditionalBodyProperty(String key, JsonValue value)
-
putAllAdditionalBodyProperties
final ChatCompletionCreateParams.Builder putAllAdditionalBodyProperties(Map<String, JsonValue> additionalBodyProperties)
-
removeAdditionalBodyProperty
final ChatCompletionCreateParams.Builder removeAdditionalBodyProperty(String key)
-
removeAllAdditionalBodyProperties
final ChatCompletionCreateParams.Builder removeAllAdditionalBodyProperties(Set<String> keys)
-
additionalHeaders
final ChatCompletionCreateParams.Builder additionalHeaders(Headers additionalHeaders)
-
additionalHeaders
final ChatCompletionCreateParams.Builder additionalHeaders(Map<String, Iterable<String>> additionalHeaders)
-
putAdditionalHeader
final ChatCompletionCreateParams.Builder putAdditionalHeader(String name, String value)
-
putAdditionalHeaders
final ChatCompletionCreateParams.Builder putAdditionalHeaders(String name, Iterable<String> values)
-
putAllAdditionalHeaders
final ChatCompletionCreateParams.Builder putAllAdditionalHeaders(Headers additionalHeaders)
-
putAllAdditionalHeaders
final ChatCompletionCreateParams.Builder putAllAdditionalHeaders(Map<String, Iterable<String>> additionalHeaders)
-
replaceAdditionalHeaders
final ChatCompletionCreateParams.Builder replaceAdditionalHeaders(String name, String value)
-
replaceAdditionalHeaders
final ChatCompletionCreateParams.Builder replaceAdditionalHeaders(String name, Iterable<String> values)
-
replaceAllAdditionalHeaders
final ChatCompletionCreateParams.Builder replaceAllAdditionalHeaders(Headers additionalHeaders)
-
replaceAllAdditionalHeaders
final ChatCompletionCreateParams.Builder replaceAllAdditionalHeaders(Map<String, Iterable<String>> additionalHeaders)
-
removeAdditionalHeaders
final ChatCompletionCreateParams.Builder removeAdditionalHeaders(String name)
-
removeAllAdditionalHeaders
final ChatCompletionCreateParams.Builder removeAllAdditionalHeaders(Set<String> names)
-
additionalQueryParams
final ChatCompletionCreateParams.Builder additionalQueryParams(QueryParams additionalQueryParams)
-
additionalQueryParams
final ChatCompletionCreateParams.Builder additionalQueryParams(Map<String, Iterable<String>> additionalQueryParams)
-
putAdditionalQueryParam
final ChatCompletionCreateParams.Builder putAdditionalQueryParam(String key, String value)
-
putAdditionalQueryParams
final ChatCompletionCreateParams.Builder putAdditionalQueryParams(String key, Iterable<String> values)
-
putAllAdditionalQueryParams
final ChatCompletionCreateParams.Builder putAllAdditionalQueryParams(QueryParams additionalQueryParams)
-
putAllAdditionalQueryParams
final ChatCompletionCreateParams.Builder putAllAdditionalQueryParams(Map<String, Iterable<String>> additionalQueryParams)
-
replaceAdditionalQueryParams
final ChatCompletionCreateParams.Builder replaceAdditionalQueryParams(String key, String value)
-
replaceAdditionalQueryParams
final ChatCompletionCreateParams.Builder replaceAdditionalQueryParams(String key, Iterable<String> values)
-
replaceAllAdditionalQueryParams
final ChatCompletionCreateParams.Builder replaceAllAdditionalQueryParams(QueryParams additionalQueryParams)
-
replaceAllAdditionalQueryParams
final ChatCompletionCreateParams.Builder replaceAllAdditionalQueryParams(Map<String, Iterable<String>> additionalQueryParams)
-
removeAdditionalQueryParams
final ChatCompletionCreateParams.Builder removeAdditionalQueryParams(String key)
-
removeAllAdditionalQueryParams
final ChatCompletionCreateParams.Builder removeAllAdditionalQueryParams(Set<String> keys)
-
build
final ChatCompletionCreateParams build()
-
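Putting the pieces together, a minimal, self-contained sketch of constructing the request parameters; the import paths, the static builder() factory, and the ChatModel constant name are assumptions, while the remaining calls use setters from this summary:

    // Sketch only: import paths and the ChatModel constant name are assumptions.
    import com.openai.models.ChatCompletionCreateParams;
    import com.openai.models.ChatModel;

    public final class ChatCompletionParamsExample {
        public static void main(String[] args) {
            ChatCompletionCreateParams params = ChatCompletionCreateParams.builder()
                .model(ChatModel.GPT_4O)   // assumed constant name
                .addUserMessage("Summarize the release notes in one paragraph.")
                .temperature(0.2)          // tune this or topP, not both
                .user("end-user-1234")
                .build();                  // produces the immutable request params
        }
    }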