Class PutOpenaiRequest
java.lang.Object
co.elastic.clients.elasticsearch._types.RequestBase
co.elastic.clients.elasticsearch.inference.PutOpenaiRequest
- All Implemented Interfaces:
JsonpSerializable
Create an OpenAI inference endpoint.
Create an inference endpoint to perform an inference task with the openai service or openai compatible APIs.
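The request is typically built with the client's fluent builder via `of`. A minimal sketch, assuming a `text_embedding` task against the hosted OpenAI service; the endpoint id, model id, API key, and the exact enum constant names (`OpenAITaskType.TextEmbedding`, `OpenAIServiceType.Openai`) are placeholders following the client's usual naming conventions:

```java
import co.elastic.clients.elasticsearch.inference.OpenAIServiceType;
import co.elastic.clients.elasticsearch.inference.OpenAITaskType;
import co.elastic.clients.elasticsearch.inference.PutOpenaiRequest;

public class PutOpenaiExample {
    public static void main(String[] args) {
        PutOpenaiRequest request = PutOpenaiRequest.of(r -> r
            .openaiInferenceId("my-openai-embeddings")   // unique id of the new endpoint (placeholder)
            .taskType(OpenAITaskType.TextEmbedding)      // task the model will perform
            .service(OpenAIServiceType.Openai)           // the "openai" service
            .serviceSettings(s -> s
                .apiKey("<api-key>")                     // placeholder credential; do not hardcode in production
                .modelId("text-embedding-3-small")));    // placeholder OpenAI model id

        System.out.println(request.openaiInferenceId());
    }
}
```

The builder lambda passed to `serviceSettings` mirrors the nested `service_settings` object of the REST API.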
Nested Class Summary
Nested classes/interfaces inherited from class co.elastic.clients.elasticsearch._types.RequestBase:
RequestBase.AbstractBuilder<BuilderT extends RequestBase.AbstractBuilder<BuilderT>>
-
Field Summary
static final JsonpDeserializer<PutOpenaiRequest> _DESERIALIZER
    Json deserializer for PutOpenaiRequest
static final Endpoint<PutOpenaiRequest, PutOpenaiResponse, ErrorResponse> _ENDPOINT
    Endpoint "inference.put_openai".
-
Method Summary
chunkingSettings()
    The chunking configuration object.
static PutOpenaiRequest of(Function<PutOpenaiRequest.Builder, ObjectBuilder<PutOpenaiRequest>> fn)
final String openaiInferenceId()
    Required - The unique identifier of the inference endpoint.
void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
    Serialize this object to JSON.
protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
final OpenAIServiceType service()
    Required - The type of service supported for the specified task type.
final OpenAIServiceSettings serviceSettings()
    Required - Settings used to install the inference model.
protected static void setupPutOpenaiRequestDeserializer(ObjectDeserializer<PutOpenaiRequest.Builder> op)
final OpenAITaskSettings taskSettings()
    Settings to configure the inference task.
final OpenAITaskType taskType()
    Required - The type of the inference task that the model will perform.
Methods inherited from class co.elastic.clients.elasticsearch._types.RequestBase:
    toString
-
Field Details
-
_DESERIALIZER
Json deserializer for PutOpenaiRequest
-
_ENDPOINT
Endpoint "inference.put_openai".
-
-
Method Details
-
of
public static PutOpenaiRequest of(Function<PutOpenaiRequest.Builder, ObjectBuilder<PutOpenaiRequest>> fn)
-
chunkingSettings
The chunking configuration object.
API name: chunking_settings
-
openaiInferenceId
Required - The unique identifier of the inference endpoint.
API name: openai_inference_id
-
service
Required - The type of service supported for the specified task type. In this case, openai.
API name: service
-
serviceSettings
Required - Settings used to install the inference model. These settings are specific to the openai service.
API name: service_settings
-
taskSettings
Settings to configure the inference task. These settings are specific to the task type you specified.
API name: task_settings
-
taskType
Required - The type of the inference task that the model will perform. NOTE: The chat_completion task type only supports streaming and only through the _stream API.
API name: task_type
-
serialize
Serialize this object to JSON.
Specified by: serialize in interface JsonpSerializable
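Because the class implements JsonpSerializable, the request body can be rendered to a JSON string directly. A minimal sketch, assuming the Jackson-backed JacksonJsonpMapper from the client's json module; all ids, model names, and the API key are placeholders:

```java
import java.io.StringWriter;
import co.elastic.clients.elasticsearch.inference.OpenAIServiceType;
import co.elastic.clients.elasticsearch.inference.OpenAITaskType;
import co.elastic.clients.elasticsearch.inference.PutOpenaiRequest;
import co.elastic.clients.json.JsonpMapper;
import co.elastic.clients.json.jackson.JacksonJsonpMapper;

public class SerializeExample {
    public static void main(String[] args) {
        PutOpenaiRequest request = PutOpenaiRequest.of(r -> r
            .openaiInferenceId("demo")                  // path parameter, not part of the body
            .taskType(OpenAITaskType.Completion)        // assumed enum constant name
            .service(OpenAIServiceType.Openai)
            .serviceSettings(s -> s
                .apiKey("<api-key>")                    // placeholder credential
                .modelId("gpt-4o-mini")));              // placeholder model id

        JsonpMapper mapper = new JacksonJsonpMapper();
        StringWriter out = new StringWriter();
        try (jakarta.json.stream.JsonGenerator generator =
                 mapper.jsonProvider().createGenerator(out)) {
            request.serialize(generator, mapper);       // writes only the request body
        }
        System.out.println(out);                        // e.g. {"service":"openai","service_settings":{...}}
    }
}
```

Note that path parameters such as openai_inference_id and task_type are carried in the URL by the `_ENDPOINT` definition and do not appear in the serialized body.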
-
serializeInternal
-
setupPutOpenaiRequestDeserializer
protected static void setupPutOpenaiRequestDeserializer(ObjectDeserializer<PutOpenaiRequest.Builder> op)
-