Class ChatCompletionCreateParams.Builder
-
public final class ChatCompletionCreateParams.Builder
A builder for ChatCompletionCreateParams.
-
-
Method Summary
All methods return ChatCompletionCreateParams.Builder, except build(), which returns the finished ChatCompletionCreateParams. Most setters also accept an Optional or a JsonField overload, noted below.

messages(List<ChatCompletionMessageParam> messages), plus JsonField overload - A list of messages comprising the conversation so far.
addMessage(ChatCompletionMessageParam message) - A list of messages comprising the conversation so far.
addMessage(ChatCompletionDeveloperMessageParam developer) - Developer-provided instructions that the model should follow, regardless of messages sent by the user.
addMessage(ChatCompletionSystemMessageParam system) - Developer-provided instructions that the model should follow, regardless of messages sent by the user.
addMessage(ChatCompletionUserMessageParam user) - Messages sent by an end user, containing prompts or additional context information.
addMessage(ChatCompletionAssistantMessageParam assistant), addMessage(ChatCompletionMessage assistant) - Messages sent by the model in response to user messages.
addMessage(ChatCompletionToolMessageParam tool), addMessage(ChatCompletionFunctionMessageParam function) - A list of messages comprising the conversation so far.
addDeveloperMessage(ChatCompletionDeveloperMessageParam.Content content) - Developer-provided instructions that the model should follow, regardless of messages sent by the user.
addDeveloperMessage(String text) - The contents of the developer message.
addDeveloperMessageOfArrayOfContentParts(List<ChatCompletionContentPartText> arrayOfContentParts) - An array of content parts with a defined type.
addSystemMessage(ChatCompletionSystemMessageParam.Content content) - Developer-provided instructions that the model should follow, regardless of messages sent by the user.
addSystemMessage(String text) - The contents of the system message.
addSystemMessageOfArrayOfContentParts(List<ChatCompletionContentPartText> arrayOfContentParts) - An array of content parts with a defined type.
addUserMessage(ChatCompletionUserMessageParam.Content content) - Messages sent by an end user, containing prompts or additional context information.
addUserMessage(String text) - The text contents of the message.
addUserMessageOfArrayOfContentParts(List<ChatCompletionContentPart> arrayOfContentParts) - An array of content parts with a defined type.
model(ChatModel model), model(String value), plus JsonField overload - ID of the model to use.
audio(ChatCompletionAudioParam audio), plus Optional and JsonField overloads - Parameters for audio output.
frequencyPenalty(Double frequencyPenalty), plus Optional and JsonField overloads - Number between -2.0 and 2.0.
functionCall(ChatCompletionCreateParams.FunctionCall functionCall), plus JsonField overload - Deprecated in favor of tool_choice.
functionCall(ChatCompletionCreateParams.FunctionCall.Auto auto) - none means the model will not call a function and instead generates a message.
functionCall(ChatCompletionFunctionCallOption functionCallOption) - Specifying a particular function via {"name": "my_function"} forces the model to call that function.
functions(List<ChatCompletionCreateParams.Function> functions), plus JsonField overload, and addFunction(ChatCompletionCreateParams.Function function) - Deprecated in favor of tools.
logitBias(ChatCompletionCreateParams.LogitBias logitBias), plus Optional and JsonField overloads - Modify the likelihood of specified tokens appearing in the completion.
logprobs(Boolean logprobs), plus Optional and JsonField overloads - Whether to return log probabilities of the output tokens or not.
maxCompletionTokens(Long maxCompletionTokens), plus Optional and JsonField overloads - An upper bound for the number of tokens that can be generated for a completion, including visible output tokens and reasoning tokens.
maxTokens(Long maxTokens), plus Optional and JsonField overloads - The maximum number of tokens that can be generated in the chat completion.
metadata(Metadata metadata), plus Optional and JsonField overloads - Set of 16 key-value pairs that can be attached to an object.
modalities(List<ChatCompletionModality> modalities), plus Optional and JsonField overloads, and addModality(ChatCompletionModality modality) - Output types that you would like the model to generate for this request.
n(Long n), plus Optional and JsonField overloads - How many chat completion choices to generate for each input message.
parallelToolCalls(Boolean parallelToolCalls), plus JsonField overload - Whether to enable parallel function calling during tool use.
prediction(ChatCompletionPredictionContent prediction), plus Optional and JsonField overloads - Static predicted output content, such as the content of a text file that is being regenerated.
presencePenalty(Double presencePenalty), plus Optional and JsonField overloads - Number between -2.0 and 2.0.
reasoningEffort(ChatCompletionReasoningEffort reasoningEffort), plus JsonField overload - o1 models only. Constrains effort on reasoning for reasoning models.
responseFormat(ChatCompletionCreateParams.ResponseFormat responseFormat), plus JsonField overload, responseFormat(ResponseFormatText text), responseFormat(ResponseFormatJsonObject jsonObject), responseFormat(ResponseFormatJsonSchema jsonSchema) - An object specifying the format that the model must output.
seed(Long seed), plus Optional and JsonField overloads - This feature is in Beta.
serviceTier(ChatCompletionCreateParams.ServiceTier serviceTier), plus Optional and JsonField overloads - Specifies the latency tier to use for processing the request.
stop(ChatCompletionCreateParams.Stop stop), plus JsonField overload, stop(String string), stopOfStrings(List<String> strings) - Up to 4 sequences where the API will stop generating further tokens.
store(Boolean store), plus Optional and JsonField overloads - Whether or not to store the output of this chat completion request for use in our model distillation or evals products.
streamOptions(ChatCompletionStreamOptions streamOptions), plus Optional and JsonField overloads - Options for streaming response.
temperature(Double temperature), plus Optional and JsonField overloads - What sampling temperature to use, between 0 and 2.
toolChoice(ChatCompletionToolChoiceOption toolChoice), plus JsonField overload - Controls which (if any) tool is called by the model.
toolChoice(ChatCompletionToolChoiceOption.Auto auto) - none means the model will not call any tool and instead generates a message.
toolChoice(ChatCompletionNamedToolChoice namedToolChoice) - Specifies a tool the model should use.
tools(List<ChatCompletionTool> tools), plus JsonField overload, and addTool(ChatCompletionTool tool) - A list of tools the model may call.
topLogprobs(Long topLogprobs), plus Optional and JsonField overloads - An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability.
topP(Double topP), plus Optional and JsonField overloads - An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass.
user(String user), plus JsonField overload - A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse.
additionalBodyProperties(Map<String, JsonValue> additionalBodyProperties), putAdditionalBodyProperty(String key, JsonValue value), putAllAdditionalBodyProperties(Map<String, JsonValue> additionalBodyProperties), removeAdditionalBodyProperty(String key), removeAllAdditionalBodyProperties(Set<String> keys)
additionalHeaders(Headers additionalHeaders), additionalHeaders(Map<String, Iterable<String>> additionalHeaders), putAdditionalHeader(String name, String value), putAdditionalHeaders(String name, Iterable<String> values), putAllAdditionalHeaders(Headers additionalHeaders), putAllAdditionalHeaders(Map<String, Iterable<String>> additionalHeaders), replaceAdditionalHeaders(String name, String value), replaceAdditionalHeaders(String name, Iterable<String> values), replaceAllAdditionalHeaders(Headers additionalHeaders), replaceAllAdditionalHeaders(Map<String, Iterable<String>> additionalHeaders), removeAdditionalHeaders(String name), removeAllAdditionalHeaders(Set<String> names)
additionalQueryParams(QueryParams additionalQueryParams), additionalQueryParams(Map<String, Iterable<String>> additionalQueryParams), putAdditionalQueryParam(String key, String value), putAdditionalQueryParams(String key, Iterable<String> values), putAllAdditionalQueryParams(QueryParams additionalQueryParams), putAllAdditionalQueryParams(Map<String, Iterable<String>> additionalQueryParams), replaceAdditionalQueryParams(String key, String value), replaceAdditionalQueryParams(String key, Iterable<String> values), replaceAllAdditionalQueryParams(QueryParams additionalQueryParams), replaceAllAdditionalQueryParams(Map<String, Iterable<String>> additionalQueryParams), removeAdditionalQueryParams(String key), removeAllAdditionalQueryParams(Set<String> keys)
build() - Builds and returns the ChatCompletionCreateParams.
-
Method Detail
-
messages
final ChatCompletionCreateParams.Builder messages(List<ChatCompletionMessageParam> messages)
A list of messages comprising the conversation so far. Depending on the model you use, different message types (modalities) are supported, like text, images, and audio.
-
messages
final ChatCompletionCreateParams.Builder messages(JsonField<List<ChatCompletionMessageParam>> messages)
A list of messages comprising the conversation so far. Depending on the model you use, different message types (modalities) are supported, like text, images, and audio.
-
addMessage
final ChatCompletionCreateParams.Builder addMessage(ChatCompletionMessageParam message)
A list of messages comprising the conversation so far. Depending on the model you use, different message types (modalities) are supported, like text, images, and audio.
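As a hedged illustration of how these adders compose (the model name and the com.openai.models package path are assumptions; the openai-java package layout varies by SDK version), a minimal request might look like:

```java
import com.openai.models.ChatCompletionCreateParams;

// Minimal sketch: each add*Message call wraps its argument in the
// appropriate ChatCompletionMessageParam variant and appends it.
ChatCompletionCreateParams params = ChatCompletionCreateParams.builder()
        .model("gpt-4o-mini")                      // assumed model name
        .addDeveloperMessage("Answer tersely.")
        .addUserMessage("What does the builder pattern buy you?")
        .build();
```

The same request could pass a prebuilt ChatCompletionMessageParam to addMessage instead; the convenience adders just save the wrapping step.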
-
addMessage
final ChatCompletionCreateParams.Builder addMessage(ChatCompletionDeveloperMessageParam developer)
Developer-provided instructions that the model should follow, regardless of messages sent by the user. With o1 models and newer, developer messages replace the previous system messages.
-
addMessage
final ChatCompletionCreateParams.Builder addMessage(ChatCompletionSystemMessageParam system)
Developer-provided instructions that the model should follow, regardless of messages sent by the user. With o1 models and newer, use developer messages for this purpose instead.
-
addMessage
final ChatCompletionCreateParams.Builder addMessage(ChatCompletionUserMessageParam user)
Messages sent by an end user, containing prompts or additional context information.
-
addMessage
final ChatCompletionCreateParams.Builder addMessage(ChatCompletionAssistantMessageParam assistant)
Messages sent by the model in response to user messages.
-
addMessage
final ChatCompletionCreateParams.Builder addMessage(ChatCompletionMessage assistant)
Messages sent by the model in response to user messages.
-
addMessage
final ChatCompletionCreateParams.Builder addMessage(ChatCompletionToolMessageParam tool)
A list of messages comprising the conversation so far. Depending on the model you use, different message types (modalities) are supported, like text, images, and audio.
-
addMessage
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder addMessage(ChatCompletionFunctionMessageParam function)
A list of messages comprising the conversation so far. Depending on the model you use, different message types (modalities) are supported, like text, images, and audio.
-
addDeveloperMessage
final ChatCompletionCreateParams.Builder addDeveloperMessage(ChatCompletionDeveloperMessageParam.Content content)
Developer-provided instructions that the model should follow, regardless of messages sent by the user. With o1 models and newer, developer messages replace the previous system messages.
-
addDeveloperMessage
final ChatCompletionCreateParams.Builder addDeveloperMessage(String text)
The contents of the developer message.
-
addDeveloperMessageOfArrayOfContentParts
final ChatCompletionCreateParams.Builder addDeveloperMessageOfArrayOfContentParts(List<ChatCompletionContentPartText> arrayOfContentParts)
An array of content parts with a defined type. For developer messages, only type text is supported.
-
addSystemMessage
final ChatCompletionCreateParams.Builder addSystemMessage(ChatCompletionSystemMessageParam.Content content)
Developer-provided instructions that the model should follow, regardless of messages sent by the user. With o1 models and newer, use developer messages for this purpose instead.
-
addSystemMessage
final ChatCompletionCreateParams.Builder addSystemMessage(String text)
The contents of the system message.
-
addSystemMessageOfArrayOfContentParts
final ChatCompletionCreateParams.Builder addSystemMessageOfArrayOfContentParts(List<ChatCompletionContentPartText> arrayOfContentParts)
An array of content parts with a defined type. For system messages, only type text is supported.
-
addUserMessage
final ChatCompletionCreateParams.Builder addUserMessage(ChatCompletionUserMessageParam.Content content)
Messages sent by an end user, containing prompts or additional context information.
-
addUserMessage
final ChatCompletionCreateParams.Builder addUserMessage(String text)
The text contents of the message.
-
addUserMessageOfArrayOfContentParts
final ChatCompletionCreateParams.Builder addUserMessageOfArrayOfContentParts(List<ChatCompletionContentPart> arrayOfContentParts)
An array of content parts with a defined type. Supported options differ based on the model being used to generate the response. Can contain text, image, or audio inputs.
-
model
final ChatCompletionCreateParams.Builder model(ChatModel model)
ID of the model to use. See the model endpoint compatibility table for details on which models work with the Chat API.
-
model
final ChatCompletionCreateParams.Builder model(JsonField<ChatModel> model)
ID of the model to use. See the model endpoint compatibility table for details on which models work with the Chat API.
-
model
final ChatCompletionCreateParams.Builder model(String value)
ID of the model to use. See the model endpoint compatibility table for details on which models work with the Chat API.
-
audio
final ChatCompletionCreateParams.Builder audio(ChatCompletionAudioParam audio)
Parameters for audio output. Required when audio output is requested with modalities: ["audio"].
-
audio
final ChatCompletionCreateParams.Builder audio(Optional<ChatCompletionAudioParam> audio)
Parameters for audio output. Required when audio output is requested with modalities: ["audio"].
-
audio
final ChatCompletionCreateParams.Builder audio(JsonField<ChatCompletionAudioParam> audio)
Parameters for audio output. Required when audio output is requested with modalities: ["audio"].
-
frequencyPenalty
final ChatCompletionCreateParams.Builder frequencyPenalty(Double frequencyPenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
-
frequencyPenalty
final ChatCompletionCreateParams.Builder frequencyPenalty(Double frequencyPenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
-
frequencyPenalty
final ChatCompletionCreateParams.Builder frequencyPenalty(Optional<Double> frequencyPenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
-
frequencyPenalty
final ChatCompletionCreateParams.Builder frequencyPenalty(JsonField<Double> frequencyPenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
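A hedged sketch of setting the penalty (values and model name are illustrative only, not recommendations from this page):

```java
import com.openai.models.ChatCompletionCreateParams;

// Positive penalties discourage repetition; 0.0 (the API default) leaves
// sampling unchanged. The values here are arbitrary examples.
ChatCompletionCreateParams params = ChatCompletionCreateParams.builder()
        .model("gpt-4o-mini")                      // assumed model name
        .addUserMessage("Write four lines about rivers.")
        .frequencyPenalty(0.8)   // penalize tokens by how often they already appeared
        .presencePenalty(0.2)    // mild penalty for any token that appeared at all
        .build();
```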
-
functionCall
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder functionCall(ChatCompletionCreateParams.FunctionCall functionCall)
Deprecated in favor of tool_choice. Controls which (if any) function is called by the model. none means the model will not call a function and instead generates a message. auto means the model can pick between generating a message or calling a function. Specifying a particular function via {"name": "my_function"} forces the model to call that function. none is the default when no functions are present. auto is the default if functions are present.
-
functionCall
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder functionCall(JsonField<ChatCompletionCreateParams.FunctionCall> functionCall)
Deprecated in favor of tool_choice. Controls which (if any) function is called by the model. none means the model will not call a function and instead generates a message. auto means the model can pick between generating a message or calling a function. Specifying a particular function via {"name": "my_function"} forces the model to call that function. none is the default when no functions are present. auto is the default if functions are present.
-
functionCall
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder functionCall(ChatCompletionCreateParams.FunctionCall.Auto auto)
none means the model will not call a function and instead generates a message. auto means the model can pick between generating a message or calling a function.
-
functionCall
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder functionCall(ChatCompletionFunctionCallOption functionCallOption)
Specifying a particular function via {"name": "my_function"} forces the model to call that function.
-
functions
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder functions(List<ChatCompletionCreateParams.Function> functions)
Deprecated in favor of tools. A list of functions the model may generate JSON inputs for.
-
functions
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder functions(JsonField<List<ChatCompletionCreateParams.Function>> functions)
Deprecated in favor of tools. A list of functions the model may generate JSON inputs for.
-
addFunction
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder addFunction(ChatCompletionCreateParams.Function function)
Deprecated in favor of tools. A list of functions the model may generate JSON inputs for.
-
logitBias
final ChatCompletionCreateParams.Builder logitBias(ChatCompletionCreateParams.LogitBias logitBias)
Modify the likelihood of specified tokens appearing in the completion.
Accepts a JSON object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token.
-
logitBias
final ChatCompletionCreateParams.Builder logitBias(Optional<ChatCompletionCreateParams.LogitBias> logitBias)
Modify the likelihood of specified tokens appearing in the completion.
Accepts a JSON object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token.
-
logitBias
final ChatCompletionCreateParams.Builder logitBias(JsonField<ChatCompletionCreateParams.LogitBias> logitBias)
Modify the likelihood of specified tokens appearing in the completion.
Accepts a JSON object that maps tokens (specified by their token ID in the tokenizer) to an associated bias value from -100 to 100. Mathematically, the bias is added to the logits generated by the model prior to sampling. The exact effect will vary per model, but values between -1 and 1 should decrease or increase likelihood of selection; values like -100 or 100 should result in a ban or exclusive selection of the relevant token.
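As a sketch (assuming LogitBias exposes the SDK's usual putAdditionalProperty escape hatch for arbitrary keys; the token ID is a placeholder, since real IDs depend on the model's tokenizer):

```java
import com.openai.core.JsonValue;
import com.openai.models.ChatCompletionCreateParams;

// Map token ID -> bias. -100 effectively bans the token; +100 forces it.
ChatCompletionCreateParams.LogitBias bias =
        ChatCompletionCreateParams.LogitBias.builder()
                .putAdditionalProperty("1234", JsonValue.from(-100)) // placeholder token ID
                .build();

ChatCompletionCreateParams params = ChatCompletionCreateParams.builder()
        .model("gpt-4o-mini")                      // assumed model name
        .addUserMessage("Hello")
        .logitBias(bias)
        .build();
```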
-
logprobs
final ChatCompletionCreateParams.Builder logprobs(Boolean logprobs)
Whether to return log probabilities of the output tokens or not. If true, returns the log probabilities of each output token returned in the content of message.
-
logprobs
final ChatCompletionCreateParams.Builder logprobs(Optional<Boolean> logprobs)
Whether to return log probabilities of the output tokens or not. If true, returns the log probabilities of each output token returned in the content of message.
-
logprobs
final ChatCompletionCreateParams.Builder logprobs(JsonField<Boolean> logprobs)
Whether to return log probabilities of the output tokens or not. If true, returns the log probabilities of each output token returned in the content of message.
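The returned values are natural-log probabilities, so exp(logprob) recovers a token's probability, and summing the per-token logprobs gives the log probability of the whole completion. A minimal illustration (the helper name is hypothetical):

```java
public class LogprobDemo {
    // Sum per-token logprobs and exponentiate to get the probability
    // of the full token sequence.
    static double sequenceProbability(double[] tokenLogprobs) {
        double sum = 0;
        for (double lp : tokenLogprobs) sum += lp;
        return Math.exp(sum);
    }

    public static void main(String[] args) {
        // Two tokens, each with probability 0.5 (logprob = ln 0.5),
        // give a sequence probability of ~0.25.
        System.out.println(sequenceProbability(new double[]{Math.log(0.5), Math.log(0.5)}));
    }
}
```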
-
maxCompletionTokens
final ChatCompletionCreateParams.Builder maxCompletionTokens(Long maxCompletionTokens)
An upper bound for the number of tokens that can be generated for a completion, including visible output tokens and reasoning tokens.
-
maxCompletionTokens
final ChatCompletionCreateParams.Builder maxCompletionTokens(Optional<Long> maxCompletionTokens)
An upper bound for the number of tokens that can be generated for a completion, including visible output tokens and reasoning tokens.
-
maxCompletionTokens
final ChatCompletionCreateParams.Builder maxCompletionTokens(JsonField<Long> maxCompletionTokens)
An upper bound for the number of tokens that can be generated for a completion, including visible output tokens and reasoning tokens.
-
maxTokens
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder maxTokens(Long maxTokens)
The maximum number of tokens that can be generated in the chat completion. This value can be used to control costs for text generated via API.
This value is now deprecated in favor of max_completion_tokens, and is not compatible with o1 series models.
-
maxTokens
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder maxTokens(Optional<Long> maxTokens)
The maximum number of tokens that can be generated in the chat completion. This value can be used to control costs for text generated via API.
This value is now deprecated in favor of max_completion_tokens, and is not compatible with o1 series models.
-
maxTokens
@Deprecated(message = "deprecated") final ChatCompletionCreateParams.Builder maxTokens(JsonField<Long> maxTokens)
The maximum number of tokens that can be generated in the chat completion. This value can be used to control costs for text generated via API.
This value is now deprecated in favor of max_completion_tokens, and is not compatible with o1 series models.
-
metadata
final ChatCompletionCreateParams.Builder metadata(Metadata metadata)
Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard.
Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters.
-
metadata
final ChatCompletionCreateParams.Builder metadata(Optional<Metadata> metadata)
Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard.
Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters.
-
metadata
final ChatCompletionCreateParams.Builder metadata(JsonField<Metadata> metadata)
Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format, and querying for objects via API or the dashboard.
Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters.
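The documented limits (at most 16 pairs, 64-character keys, 512-character values) can be checked client-side before sending a request. This validation helper is hypothetical, not part of the SDK:

```java
import java.util.Map;

public class MetadataCheck {
    // Enforce the documented metadata constraints: at most 16 pairs,
    // keys up to 64 characters, values up to 512 characters.
    static void validate(Map<String, String> metadata) {
        if (metadata.size() > 16)
            throw new IllegalArgumentException("at most 16 key-value pairs allowed");
        metadata.forEach((k, v) -> {
            if (k.length() > 64)
                throw new IllegalArgumentException("key exceeds 64 characters: " + k);
            if (v.length() > 512)
                throw new IllegalArgumentException("value exceeds 512 characters for key: " + k);
        });
    }

    public static void main(String[] args) {
        // Passes silently: both pairs are within the limits.
        validate(Map.of("batch_id", "nightly-run", "owner", "data-team"));
        System.out.println("ok");
    }
}
```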
-
modalities
final ChatCompletionCreateParams.Builder modalities(List<ChatCompletionModality> modalities)
Output types that you would like the model to generate for this request. Most models are capable of generating text, which is the default: ["text"].
The gpt-4o-audio-preview model can also be used to generate audio. To request that this model generate both text and audio responses, you can use: ["text", "audio"].
-
modalities
final ChatCompletionCreateParams.Builder modalities(Optional<List<ChatCompletionModality>> modalities)
Output types that you would like the model to generate for this request. Most models are capable of generating text, which is the default: ["text"].
The gpt-4o-audio-preview model can also be used to generate audio. To request that this model generate both text and audio responses, you can use: ["text", "audio"].
-
modalities
final ChatCompletionCreateParams.Builder modalities(JsonField<List<ChatCompletionModality>> modalities)
Output types that you would like the model to generate for this request. Most models are capable of generating text, which is the default: ["text"].
The gpt-4o-audio-preview model can also be used to generate audio. To request that this model generate both text and audio responses, you can use: ["text", "audio"].
-
addModality
final ChatCompletionCreateParams.Builder addModality(ChatCompletionModality modality)
Output types that you would like the model to generate for this request. Most models are capable of generating text, which is the default: ["text"].
The gpt-4o-audio-preview model can also be used to generate audio. To request that this model generate both text and audio responses, you can use: ["text", "audio"].
-
n
final ChatCompletionCreateParams.Builder n(Long n)
How many chat completion choices to generate for each input message. Note that you will be charged based on the number of generated tokens across all of the choices. Keep n as 1 to minimize costs.
-
n
final ChatCompletionCreateParams.Builder n(Optional<Long> n)
How many chat completion choices to generate for each input message. Note that you will be charged based on the number of generated tokens across all of the choices. Keep n as 1 to minimize costs.
-
n
final ChatCompletionCreateParams.Builder n(JsonField<Long> n)
How many chat completion choices to generate for each input message. Note that you will be charged based on the number of generated tokens across all of the choices. Keep n as 1 to minimize costs.
-
parallelToolCalls
final ChatCompletionCreateParams.Builder parallelToolCalls(Boolean parallelToolCalls)
Whether to enable parallel function calling during tool use.
-
parallelToolCalls
final ChatCompletionCreateParams.Builder parallelToolCalls(JsonField<Boolean> parallelToolCalls)
Whether to enable parallel function calling during tool use.
-
prediction
final ChatCompletionCreateParams.Builder prediction(ChatCompletionPredictionContent prediction)
Static predicted output content, such as the content of a text file that is being regenerated.
-
prediction
final ChatCompletionCreateParams.Builder prediction(Optional<ChatCompletionPredictionContent> prediction)
Static predicted output content, such as the content of a text file that is being regenerated.
-
prediction
final ChatCompletionCreateParams.Builder prediction(JsonField<ChatCompletionPredictionContent> prediction)
Static predicted output content, such as the content of a text file that is being regenerated.
-
presencePenalty
final ChatCompletionCreateParams.Builder presencePenalty(Double presencePenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
-
presencePenalty
final ChatCompletionCreateParams.Builder presencePenalty(Optional<Double> presencePenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
-
presencePenalty
final ChatCompletionCreateParams.Builder presencePenalty(JsonField<Double> presencePenalty)
Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
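One way to picture the effect described above: every token that has already appeared in the text has the penalty subtracted from its logit once, regardless of how many times it appeared, which nudges sampling toward unseen tokens. This is an illustrative sketch, not the server-side implementation:

```java
import java.util.Set;

public class PresencePenaltyDemo {
    // Subtract the penalty once from the logit of every token ID that
    // has already appeared in the generated text.
    static double[] apply(double[] logits, Set<Integer> seenTokenIds, double penalty) {
        double[] out = logits.clone();
        for (int id : seenTokenIds) out[id] -= penalty;
        return out;
    }

    public static void main(String[] args) {
        double[] penalized = apply(new double[]{1.0, 1.0}, Set.of(0), 0.5);
        // The seen token's logit dropped; the unseen one is untouched.
        System.out.println(penalized[0] + " " + penalized[1]); // 0.5 1.0
    }
}
```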
-
reasoningEffort
final ChatCompletionCreateParams.Builder reasoningEffort(ChatCompletionReasoningEffort reasoningEffort)
o1 models only
Constrains effort on reasoning for reasoning models. Currently supported values are low, medium, and high. Reducing reasoning effort can result in faster responses and fewer tokens used on reasoning in a response.
-
reasoningEffort
final ChatCompletionCreateParams.Builder reasoningEffort(JsonField<ChatCompletionReasoningEffort> reasoningEffort)
o1 models only
Constrains effort on reasoning for reasoning models. Currently supported values are low, medium, and high. Reducing reasoning effort can result in faster responses and fewer tokens used on reasoning in a response.
-
responseFormat
final ChatCompletionCreateParams.Builder responseFormat(ChatCompletionCreateParams.ResponseFormat responseFormat)
An object specifying the format that the model must output.
Setting to { "type": "json_schema", "json_schema": {...} } enables Structured Outputs, which ensures the model will match your supplied JSON schema. Learn more in the Structured Outputs guide.
Setting to { "type": "json_object" } enables JSON mode, which ensures the message the model generates is valid JSON.
Important: when using JSON mode, you must also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if finish_reason="length", which indicates the generation exceeded max_tokens or the conversation exceeded the max context length.
-
responseFormat
final ChatCompletionCreateParams.Builder responseFormat(JsonField<ChatCompletionCreateParams.ResponseFormat> responseFormat)
An object specifying the format that the model must output.
Setting to { "type": "json_schema", "json_schema": {...} } enables Structured Outputs, which ensures the model will match your supplied JSON schema. Learn more in the Structured Outputs guide.
Setting to { "type": "json_object" } enables JSON mode, which ensures the message the model generates is valid JSON.
Important: when using JSON mode, you must also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if finish_reason="length", which indicates the generation exceeded max_tokens or the conversation exceeded the max context length.
-
responseFormat
final ChatCompletionCreateParams.Builder responseFormat(ResponseFormatText text)
An object specifying the format that the model must output.
Setting to { "type": "json_schema", "json_schema": {...} } enables Structured Outputs, which ensures the model will match your supplied JSON schema. Learn more in the Structured Outputs guide.
Setting to { "type": "json_object" } enables JSON mode, which ensures the message the model generates is valid JSON.
Important: when using JSON mode, you must also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if finish_reason="length", which indicates the generation exceeded max_tokens or the conversation exceeded the max context length.
-
responseFormat
final ChatCompletionCreateParams.Builder responseFormat(ResponseFormatJsonObject jsonObject)
An object specifying the format that the model must output.
Setting to { "type": "json_schema", "json_schema": {...} } enables Structured Outputs, which ensures the model will match your supplied JSON schema. Learn more in the Structured Outputs guide.
Setting to { "type": "json_object" } enables JSON mode, which ensures the message the model generates is valid JSON.
Important: when using JSON mode, you must also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if finish_reason="length", which indicates the generation exceeded max_tokens or the conversation exceeded the max context length.
-
responseFormat
final ChatCompletionCreateParams.Builder responseFormat(ResponseFormatJsonSchema jsonSchema)
An object specifying the format that the model must output.
Setting to { "type": "json_schema", "json_schema": {...} } enables Structured Outputs, which ensures the model will match your supplied JSON schema. Learn more in the Structured Outputs guide.
Setting to { "type": "json_object" } enables JSON mode, which ensures the message the model generates is valid JSON.
Important: when using JSON mode, you must also instruct the model to produce JSON yourself via a system or user message. Without this, the model may generate an unending stream of whitespace until the generation reaches the token limit, resulting in a long-running and seemingly "stuck" request. Also note that the message content may be partially cut off if finish_reason="length", which indicates the generation exceeded max_tokens or the conversation exceeded the max context length.
-
seed
final ChatCompletionCreateParams.Builder seed(Long seed)
This feature is in Beta. If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same seed and parameters should return the same result. Determinism is not guaranteed, and you should refer to the system_fingerprint response parameter to monitor changes in the backend.
-
seed
final ChatCompletionCreateParams.Builder seed(Optional<Long> seed)
This feature is in Beta. If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same seed and parameters should return the same result. Determinism is not guaranteed, and you should refer to the system_fingerprint response parameter to monitor changes in the backend.
-
seed
final ChatCompletionCreateParams.Builder seed(JsonField<Long> seed)
This feature is in Beta. If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same seed and parameters should return the same result. Determinism is not guaranteed, and you should refer to the system_fingerprint response parameter to monitor changes in the backend.
-
serviceTier
final ChatCompletionCreateParams.Builder serviceTier(ChatCompletionCreateParams.ServiceTier serviceTier)
Specifies the latency tier to use for processing the request. This parameter is relevant for customers subscribed to the scale tier service:
If set to 'auto', and the Project is Scale tier enabled, the system will utilize scale tier credits until they are exhausted.
If set to 'auto', and the Project is not Scale tier enabled, the request will be processed using the default service tier with a lower uptime SLA and no latency guarantee.
If set to 'default', the request will be processed using the default service tier with a lower uptime SLA and no latency guarantee.
When not set, the default behavior is 'auto'.
-
serviceTier
final ChatCompletionCreateParams.Builder serviceTier(Optional<ChatCompletionCreateParams.ServiceTier> serviceTier)
Specifies the latency tier to use for processing the request. This parameter is relevant for customers subscribed to the scale tier service:
If set to 'auto', and the Project is Scale tier enabled, the system will utilize scale tier credits until they are exhausted.
If set to 'auto', and the Project is not Scale tier enabled, the request will be processed using the default service tier with a lower uptime SLA and no latency guarantee.
If set to 'default', the request will be processed using the default service tier with a lower uptime SLA and no latency guarantee.
When not set, the default behavior is 'auto'.
-
serviceTier
final ChatCompletionCreateParams.Builder serviceTier(JsonField<ChatCompletionCreateParams.ServiceTier> serviceTier)
Specifies the latency tier to use for processing the request. This parameter is relevant for customers subscribed to the scale tier service:
If set to 'auto', and the Project is Scale tier enabled, the system will utilize scale tier credits until they are exhausted.
If set to 'auto', and the Project is not Scale tier enabled, the request will be processed using the default service tier with a lower uptime SLA and no latency guarantee.
If set to 'default', the request will be processed using the default service tier with a lower uptime SLA and no latency guarantee.
When not set, the default behavior is 'auto'.
-
stop
final ChatCompletionCreateParams.Builder stop(ChatCompletionCreateParams.Stop stop)
Up to 4 sequences where the API will stop generating further tokens.
-
stop
final ChatCompletionCreateParams.Builder stop(JsonField<ChatCompletionCreateParams.Stop> stop)
Up to 4 sequences where the API will stop generating further tokens.
-
stop
final ChatCompletionCreateParams.Builder stop(String string)
Up to 4 sequences where the API will stop generating further tokens.
-
stopOfStrings
final ChatCompletionCreateParams.Builder stopOfStrings(List<String> strings)
Up to 4 sequences where the API will stop generating further tokens.
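A stop sequence cuts generation off at its first occurrence, so the returned text ends just before it. This client-side helper mimics that truncation over an already-generated string; the helper name is hypothetical:

```java
import java.util.List;

public class StopDemo {
    // Return the prefix of text before the earliest occurrence of any
    // stop sequence (or the whole text if none occurs).
    static String truncateAtStop(String text, List<String> stops) {
        int cut = text.length();
        for (String s : stops) {
            int i = text.indexOf(s);
            if (i >= 0 && i < cut) cut = i;
        }
        return text.substring(0, cut);
    }

    public static void main(String[] args) {
        // Generation stops at "END"; the stop sequence itself is dropped.
        System.out.println(truncateAtStop("Answer: 42 END extra", List.of("END")));
    }
}
```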
-
store
final ChatCompletionCreateParams.Builder store(Boolean store)
Whether or not to store the output of this chat completion request for use in our model distillation or evals products.
-
store
final ChatCompletionCreateParams.Builder store(Optional<Boolean> store)
Whether or not to store the output of this chat completion request for use in our model distillation or evals products.
-
store
final ChatCompletionCreateParams.Builder store(JsonField<Boolean> store)
Whether or not to store the output of this chat completion request for use in our model distillation or evals products.
-
streamOptions
final ChatCompletionCreateParams.Builder streamOptions(ChatCompletionStreamOptions streamOptions)
Options for streaming response. Only set this when you set
stream: true.
-
streamOptions
final ChatCompletionCreateParams.Builder streamOptions(Optional<ChatCompletionStreamOptions> streamOptions)
Options for streaming response. Only set this when you set
stream: true.
-
streamOptions
final ChatCompletionCreateParams.Builder streamOptions(JsonField<ChatCompletionStreamOptions> streamOptions)
Options for streaming response. Only set this when you set
stream: true.
-
temperature
final ChatCompletionCreateParams.Builder temperature(Double temperature)
What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. We generally recommend altering this or top_p, but not both.
-
temperature
final ChatCompletionCreateParams.Builder temperature(Optional<Double> temperature)
What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. We generally recommend altering this or top_p, but not both.
-
temperature
final ChatCompletionCreateParams.Builder temperature(JsonField<Double> temperature)
What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. We generally recommend altering this or top_p, but not both.
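The "more random" vs "more focused" behavior can be sketched with the usual temperature-scaled softmax: logits are divided by the temperature before normalizing, so low values sharpen the distribution and high values flatten it. Illustrative only, not the server-side implementation:

```java
public class TemperatureDemo {
    // Softmax over logits scaled by 1/temperature.
    static double[] softmax(double[] logits, double temperature) {
        double[] p = new double[logits.length];
        double sum = 0;
        for (int i = 0; i < logits.length; i++) {
            p[i] = Math.exp(logits[i] / temperature);
            sum += p[i];
        }
        for (int i = 0; i < p.length; i++) p[i] /= sum;
        return p;
    }

    public static void main(String[] args) {
        double[] sharp = softmax(new double[]{2.0, 1.0}, 0.2); // focused
        double[] flat  = softmax(new double[]{2.0, 1.0}, 2.0); // random
        // The leading token dominates far more at low temperature.
        System.out.println(sharp[0] > flat[0]); // true
    }
}
```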
-
toolChoice
final ChatCompletionCreateParams.Builder toolChoice(ChatCompletionToolChoiceOption toolChoice)
Controls which (if any) tool is called by the model.
none means the model will not call any tool and instead generates a message. auto means the model can pick between generating a message or calling one or more tools. required means the model must call one or more tools. Specifying a particular tool via {"type": "function", "function": {"name": "my_function"}} forces the model to call that tool. none is the default when no tools are present; auto is the default if tools are present.
-
toolChoice
final ChatCompletionCreateParams.Builder toolChoice(JsonField<ChatCompletionToolChoiceOption> toolChoice)
Controls which (if any) tool is called by the model.
none means the model will not call any tool and instead generates a message. auto means the model can pick between generating a message or calling one or more tools. required means the model must call one or more tools. Specifying a particular tool via {"type": "function", "function": {"name": "my_function"}} forces the model to call that tool. none is the default when no tools are present; auto is the default if tools are present.
-
toolChoice
final ChatCompletionCreateParams.Builder toolChoice(ChatCompletionToolChoiceOption.Auto auto)
none means the model will not call any tool and instead generates a message. auto means the model can pick between generating a message or calling one or more tools. required means the model must call one or more tools.
-
toolChoice
final ChatCompletionCreateParams.Builder toolChoice(ChatCompletionNamedToolChoice namedToolChoice)
Specifies a tool the model should use. Use to force the model to call a specific function.
-
tools
final ChatCompletionCreateParams.Builder tools(List<ChatCompletionTool> tools)
A list of tools the model may call. Currently, only functions are supported as a tool. Use this to provide a list of functions the model may generate JSON inputs for. A max of 128 functions are supported.
-
tools
final ChatCompletionCreateParams.Builder tools(JsonField<List<ChatCompletionTool>> tools)
A list of tools the model may call. Currently, only functions are supported as a tool. Use this to provide a list of functions the model may generate JSON inputs for. A max of 128 functions are supported.
-
addTool
final ChatCompletionCreateParams.Builder addTool(ChatCompletionTool tool)
A list of tools the model may call. Currently, only functions are supported as a tool. Use this to provide a list of functions the model may generate JSON inputs for. A max of 128 functions are supported.
-
topLogprobs
final ChatCompletionCreateParams.Builder topLogprobs(Long topLogprobs)
An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability. logprobs must be set to true if this parameter is used.
-
topLogprobs
final ChatCompletionCreateParams.Builder topLogprobs(Optional<Long> topLogprobs)
An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability. logprobs must be set to true if this parameter is used.
-
topLogprobs
final ChatCompletionCreateParams.Builder topLogprobs(JsonField<Long> topLogprobs)
An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability. logprobs must be set to true if this parameter is used.
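Conceptually, topLogprobs = k asks for the k highest-probability candidates at each position. This sketch mimics that selection over a toy token-to-logprob map (the data and helper are illustrative):

```java
import java.util.List;
import java.util.Map;

public class TopLogprobsDemo {
    // Pick the k entries with the highest log probability, best first.
    static List<String> topK(Map<String, Double> logprobs, int k) {
        return logprobs.entrySet().stream()
                .sorted(Map.Entry.<String, Double>comparingByValue().reversed())
                .limit(k)
                .map(Map.Entry::getKey)
                .toList();
    }

    public static void main(String[] args) {
        Map<String, Double> lp = Map.of("cat", -0.1, "dog", -2.3, "rat", -5.0);
        System.out.println(topK(lp, 2)); // [cat, dog]
    }
}
```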
-
topP
final ChatCompletionCreateParams.Builder topP(Double topP)
An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered.
We generally recommend altering this or temperature, but not both.
-
topP
final ChatCompletionCreateParams.Builder topP(Optional<Double> topP)
An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered.
We generally recommend altering this or temperature, but not both.
-
topP
final ChatCompletionCreateParams.Builder topP(JsonField<Double> topP)
An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered.
We generally recommend altering this or temperature, but not both.
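The "top 10% probability mass" description corresponds to nucleus sampling: sort tokens by probability and keep the smallest prefix whose cumulative mass reaches top_p. A minimal sketch of that selection step (illustrative, not the server-side sampler):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class TopPDemo {
    // Return the token IDs kept by nucleus sampling: the smallest
    // highest-probability prefix whose cumulative mass reaches topP.
    static List<Integer> nucleus(double[] probs, double topP) {
        Integer[] order = new Integer[probs.length];
        for (int i = 0; i < probs.length; i++) order[i] = i;
        Arrays.sort(order, (a, b) -> Double.compare(probs[b], probs[a]));
        List<Integer> kept = new ArrayList<>();
        double mass = 0;
        for (int id : order) {
            kept.add(id);
            mass += probs[id];
            if (mass >= topP) break;
        }
        return kept;
    }

    public static void main(String[] args) {
        // With top_p = 0.9, the 0.05 tail token is excluded.
        System.out.println(nucleus(new double[]{0.6, 0.35, 0.05}, 0.9)); // [0, 1]
    }
}
```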
-
user
final ChatCompletionCreateParams.Builder user(String user)
A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. Learn more.
-
user
final ChatCompletionCreateParams.Builder user(JsonField<String> user)
A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse. Learn more.
-
additionalBodyProperties
final ChatCompletionCreateParams.Builder additionalBodyProperties(Map<String, JsonValue> additionalBodyProperties)
-
putAdditionalBodyProperty
final ChatCompletionCreateParams.Builder putAdditionalBodyProperty(String key, JsonValue value)
-
putAllAdditionalBodyProperties
final ChatCompletionCreateParams.Builder putAllAdditionalBodyProperties(Map<String, JsonValue> additionalBodyProperties)
-
removeAdditionalBodyProperty
final ChatCompletionCreateParams.Builder removeAdditionalBodyProperty(String key)
-
removeAllAdditionalBodyProperties
final ChatCompletionCreateParams.Builder removeAllAdditionalBodyProperties(Set<String> keys)
-
additionalHeaders
final ChatCompletionCreateParams.Builder additionalHeaders(Headers additionalHeaders)
-
additionalHeaders
final ChatCompletionCreateParams.Builder additionalHeaders(Map<String, Iterable<String>> additionalHeaders)
-
putAdditionalHeader
final ChatCompletionCreateParams.Builder putAdditionalHeader(String name, String value)
-
putAdditionalHeaders
final ChatCompletionCreateParams.Builder putAdditionalHeaders(String name, Iterable<String> values)
-
putAllAdditionalHeaders
final ChatCompletionCreateParams.Builder putAllAdditionalHeaders(Headers additionalHeaders)
-
putAllAdditionalHeaders
final ChatCompletionCreateParams.Builder putAllAdditionalHeaders(Map<String, Iterable<String>> additionalHeaders)
-
replaceAdditionalHeaders
final ChatCompletionCreateParams.Builder replaceAdditionalHeaders(String name, String value)
-
replaceAdditionalHeaders
final ChatCompletionCreateParams.Builder replaceAdditionalHeaders(String name, Iterable<String> values)
-
replaceAllAdditionalHeaders
final ChatCompletionCreateParams.Builder replaceAllAdditionalHeaders(Headers additionalHeaders)
-
replaceAllAdditionalHeaders
final ChatCompletionCreateParams.Builder replaceAllAdditionalHeaders(Map<String, Iterable<String>> additionalHeaders)
-
removeAdditionalHeaders
final ChatCompletionCreateParams.Builder removeAdditionalHeaders(String name)
-
removeAllAdditionalHeaders
final ChatCompletionCreateParams.Builder removeAllAdditionalHeaders(Set<String> names)
-
additionalQueryParams
final ChatCompletionCreateParams.Builder additionalQueryParams(QueryParams additionalQueryParams)
-
additionalQueryParams
final ChatCompletionCreateParams.Builder additionalQueryParams(Map<String, Iterable<String>> additionalQueryParams)
-
putAdditionalQueryParam
final ChatCompletionCreateParams.Builder putAdditionalQueryParam(String key, String value)
-
putAdditionalQueryParams
final ChatCompletionCreateParams.Builder putAdditionalQueryParams(String key, Iterable<String> values)
-
putAllAdditionalQueryParams
final ChatCompletionCreateParams.Builder putAllAdditionalQueryParams(QueryParams additionalQueryParams)
-
putAllAdditionalQueryParams
final ChatCompletionCreateParams.Builder putAllAdditionalQueryParams(Map<String, Iterable<String>> additionalQueryParams)
-
replaceAdditionalQueryParams
final ChatCompletionCreateParams.Builder replaceAdditionalQueryParams(String key, String value)
-
replaceAdditionalQueryParams
final ChatCompletionCreateParams.Builder replaceAdditionalQueryParams(String key, Iterable<String> values)
-
replaceAllAdditionalQueryParams
final ChatCompletionCreateParams.Builder replaceAllAdditionalQueryParams(QueryParams additionalQueryParams)
-
replaceAllAdditionalQueryParams
final ChatCompletionCreateParams.Builder replaceAllAdditionalQueryParams(Map<String, Iterable<String>> additionalQueryParams)
-
removeAdditionalQueryParams
final ChatCompletionCreateParams.Builder removeAdditionalQueryParams(String key)
-
removeAllAdditionalQueryParams
final ChatCompletionCreateParams.Builder removeAllAdditionalQueryParams(Set<String> keys)
-
build
final ChatCompletionCreateParams build()
-