Class PutOpenaiRequest
java.lang.Object
  co.elastic.clients.elasticsearch._types.RequestBase
    co.elastic.clients.elasticsearch.inference.PutOpenaiRequest

All Implemented Interfaces:
  JsonpSerializable
Create an OpenAI inference endpoint.
 
Create an inference endpoint to perform an inference task with the
openai service or OpenAI-compatible APIs.
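The request is typically built with the fluent builder API and passed to the client's inference namespace. A minimal, hedged sketch follows; the endpoint id, model id, API-key placeholder, and the enum constant spellings (OpenAITaskType.TextEmbedding, OpenAIServiceType.Openai) are illustrative assumptions, not taken from this page. It assumes the elasticsearch-java client is on the classpath and an ElasticsearchClient is already configured.

```java
import co.elastic.clients.elasticsearch.ElasticsearchClient;
import co.elastic.clients.elasticsearch.inference.OpenAIServiceType;
import co.elastic.clients.elasticsearch.inference.OpenAITaskType;
import co.elastic.clients.elasticsearch.inference.PutOpenaiRequest;
import co.elastic.clients.elasticsearch.inference.PutOpenaiResponse;
import java.io.IOException;

public class PutOpenaiExample {

    // Sketch only: assumes an already-configured ElasticsearchClient.
    static PutOpenaiResponse createEndpoint(ElasticsearchClient esClient) throws IOException {
        PutOpenaiRequest request = PutOpenaiRequest.of(r -> r
            .openaiInferenceId("my-openai-embeddings")   // hypothetical endpoint id
            .taskType(OpenAITaskType.TextEmbedding)      // assumed constant name
            .service(OpenAIServiceType.Openai)           // assumed constant name
            .serviceSettings(s -> s
                .apiKey("<your-openai-api-key>")         // placeholder, not a real key
                .modelId("text-embedding-3-small")       // illustrative model id
            )
        );
        // Endpoint "inference.put_openai" is exposed through the inference namespace.
        return esClient.inference().putOpenai(request);
    }
}
```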
Nested Class Summary

Nested classes/interfaces inherited from class co.elastic.clients.elasticsearch._types.RequestBase:
  RequestBase.AbstractBuilder<BuilderT extends RequestBase.AbstractBuilder<BuilderT>>
Field Summary

Fields
  static final JsonpDeserializer<PutOpenaiRequest> _DESERIALIZER
      Json deserializer for PutOpenaiRequest
  static final Endpoint<PutOpenaiRequest, PutOpenaiResponse, ErrorResponse> _ENDPOINT
      Endpoint "inference.put_openai".
Method Summary

  chunkingSettings()
      The chunking configuration object.
  static PutOpenaiRequest of(Function<PutOpenaiRequest.Builder, ObjectBuilder<PutOpenaiRequest>> fn)
  final String openaiInferenceId()
      Required - The unique identifier of the inference endpoint.
  void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
      Serialize this object to JSON.
  protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
  final OpenAIServiceType service()
      Required - The type of service supported for the specified task type.
  final OpenAIServiceSettings serviceSettings()
      Required - Settings used to install the inference model.
  protected static void setupPutOpenaiRequestDeserializer(ObjectDeserializer<PutOpenaiRequest.Builder> op)
  final OpenAITaskSettings taskSettings()
      Settings to configure the inference task.
  final OpenAITaskType taskType()
      Required - The type of the inference task that the model will perform.
  final Time timeout()
      Specifies the amount of time to wait for the inference endpoint to be created.

Methods inherited from class co.elastic.clients.elasticsearch._types.RequestBase:
  toString
Field Details

_DESERIALIZER
  static final JsonpDeserializer<PutOpenaiRequest> _DESERIALIZER
  Json deserializer for PutOpenaiRequest

_ENDPOINT
  static final Endpoint<PutOpenaiRequest, PutOpenaiResponse, ErrorResponse> _ENDPOINT
  Endpoint "inference.put_openai".
Method Details

of
  public static PutOpenaiRequest of(Function<PutOpenaiRequest.Builder, ObjectBuilder<PutOpenaiRequest>> fn)
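The of(...) factory is shorthand for allocating a Builder, applying the lambda to it, and calling build(). A hedged sketch of the two equivalent construction styles; the field values and OpenAI enum constant spellings are assumptions for illustration.

```java
import co.elastic.clients.elasticsearch.inference.OpenAIServiceType;
import co.elastic.clients.elasticsearch.inference.OpenAITaskType;
import co.elastic.clients.elasticsearch.inference.PutOpenaiRequest;

public class OfVersusBuilder {
    public static void main(String[] args) {
        // Lambda form: the Function receives a fresh Builder and returns it.
        PutOpenaiRequest viaOf = PutOpenaiRequest.of(b -> b
            .openaiInferenceId("my-endpoint")            // hypothetical id
            .taskType(OpenAITaskType.TextEmbedding)      // assumed constant name
            .service(OpenAIServiceType.Openai)           // assumed constant name
            .serviceSettings(s -> s.apiKey("<key>").modelId("text-embedding-3-small")));

        // Equivalent explicit-builder form.
        PutOpenaiRequest viaBuilder = new PutOpenaiRequest.Builder()
            .openaiInferenceId("my-endpoint")
            .taskType(OpenAITaskType.TextEmbedding)
            .service(OpenAIServiceType.Openai)
            .serviceSettings(s -> s.apiKey("<key>").modelId("text-embedding-3-small"))
            .build();
    }
}
```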
chunkingSettings
  The chunking configuration object.
  API name: chunking_settings

openaiInferenceId
  Required - The unique identifier of the inference endpoint.
  API name: openai_inference_id

service
  Required - The type of service supported for the specified task type. In this case, openai.
  API name: service

serviceSettings
  Required - Settings used to install the inference model. These settings are specific to the openai service.
  API name: service_settings

taskSettings
  Settings to configure the inference task. These settings are specific to the task type you specified.
  API name: task_settings

taskType
  Required - The type of the inference task that the model will perform. NOTE: The chat_completion task type only supports streaming and only through the _stream API.
  API name: task_type

timeout
  Specifies the amount of time to wait for the inference endpoint to be created.
  API name: timeout
serialize
  public void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
  Serialize this object to JSON.
  Specified by:
    serialize in interface JsonpSerializable
serializeInternal
  protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)

setupPutOpenaiRequestDeserializer
  protected static void setupPutOpenaiRequestDeserializer(ObjectDeserializer<PutOpenaiRequest.Builder> op)
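Because the class implements JsonpSerializable, the request body can be rendered to JSON with any JsonpMapper. A hedged sketch using the client's Jackson-backed mapper; the helper name toJson is hypothetical, and the pattern assumes the elasticsearch-java jackson integration is on the classpath.

```java
import co.elastic.clients.elasticsearch.inference.PutOpenaiRequest;
import co.elastic.clients.json.JsonpMapper;
import co.elastic.clients.json.jackson.JacksonJsonpMapper;
import jakarta.json.stream.JsonGenerator;
import java.io.StringWriter;

public class SerializeExample {

    // Renders a PutOpenaiRequest's body to a JSON string via its
    // JsonpSerializable contract (hypothetical helper, for illustration).
    static String toJson(PutOpenaiRequest request) {
        StringWriter out = new StringWriter();
        JsonpMapper mapper = new JacksonJsonpMapper();
        try (JsonGenerator generator = mapper.jsonProvider().createGenerator(out)) {
            request.serialize(generator, mapper); // writes the request body as JSON
        }
        return out.toString();
    }
}
```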
 