Class PutVoyageaiRequest
java.lang.Object
co.elastic.clients.elasticsearch._types.RequestBase
co.elastic.clients.elasticsearch.inference.PutVoyageaiRequest
- All Implemented Interfaces:
- JsonpSerializable
@JsonpDeserializable
public class PutVoyageaiRequest
extends RequestBase
implements JsonpSerializable
Create a VoyageAI inference endpoint.
 
 Create an inference endpoint to perform an inference task with the
 voyageai service.
 
Avoid creating multiple endpoints for the same model unless required, as each endpoint consumes significant resources.
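A minimal construction sketch is shown below; the builder method names come from this class, while the enum constants and the service-settings field names (modelId, apiKey) are assumptions for illustration rather than confirmed API members:

    // Sketch: build a PutVoyageaiRequest for a text-embedding endpoint.
    // Enum constants and service-settings field names are assumed, not confirmed.
    PutVoyageaiRequest request = PutVoyageaiRequest.of(r -> r
        .voyageaiInferenceId("my-voyageai-embeddings")   // required: endpoint id
        .taskType(VoyageAITaskType.TextEmbedding)        // required: task type (assumed constant)
        .service(VoyageAIServiceType.Voyageai)           // required: the voyageai service (assumed constant)
        .serviceSettings(s -> s                          // required: voyageai-specific settings
            .modelId("voyage-3")                         // assumed field name
            .apiKey("<voyage-api-key>")                  // assumed field name
        )
    );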
Nested Class Summary

Nested Classes
- static class PutVoyageaiRequest.Builder
  Builder for PutVoyageaiRequest.

Nested classes/interfaces inherited from class co.elastic.clients.elasticsearch._types.RequestBase:
RequestBase.AbstractBuilder<BuilderT extends RequestBase.AbstractBuilder<BuilderT>>
Field Summary

Fields
- static final JsonpDeserializer<PutVoyageaiRequest> _DESERIALIZER
  Json deserializer for PutVoyageaiRequest
- static final Endpoint<PutVoyageaiRequest, PutVoyageaiResponse, ErrorResponse> _ENDPOINT
  Endpoint "inference.put_voyageai".
Method Summary

- chunkingSettings()
  The chunking configuration object.
- static PutVoyageaiRequest of(Function<PutVoyageaiRequest.Builder, ObjectBuilder<PutVoyageaiRequest>> fn)
- void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
  Serialize this object to JSON.
- protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
- final VoyageAIServiceType service()
  Required - The type of service supported for the specified task type.
- final VoyageAIServiceSettings serviceSettings()
  Required - Settings used to install the inference model.
- protected static void setupPutVoyageaiRequestDeserializer(ObjectDeserializer<PutVoyageaiRequest.Builder> op)
- final VoyageAITaskSettings taskSettings()
  Settings to configure the inference task.
- final VoyageAITaskType taskType()
  Required - The type of the inference task that the model will perform.
- final Time timeout()
  Specifies the amount of time to wait for the inference endpoint to be created.
- final String voyageaiInferenceId()
  Required - The unique identifier of the inference endpoint.

Methods inherited from class co.elastic.clients.elasticsearch._types.RequestBase:
toString
Field Details

- _DESERIALIZER
  Json deserializer for PutVoyageaiRequest
- _ENDPOINT
  Endpoint "inference.put_voyageai".
Method Details

- of
  public static PutVoyageaiRequest of(Function<PutVoyageaiRequest.Builder, ObjectBuilder<PutVoyageaiRequest>> fn)
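In typical use the built request (or the same builder lambda) is handed to the inference client; as a sketch, assuming client is an ElasticsearchClient and that its inference namespace exposes a putVoyageai method matching the "inference.put_voyageai" endpoint:

    // Sketch only: send the request built above through the inference client.
    // The putVoyageai(...) method name is inferred from the endpoint id, not confirmed here.
    PutVoyageaiResponse response = client.inference().putVoyageai(request);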
- chunkingSettings
  The chunking configuration object.
  API name: chunking_settings
- service
  Required - The type of service supported for the specified task type. In this case, voyageai.
  API name: service
- serviceSettings
  Required - Settings used to install the inference model. These settings are specific to the voyageai service.
  API name: service_settings
- taskSettings
  Settings to configure the inference task. These settings are specific to the task type you specified.
  API name: task_settings
- taskType
  Required - The type of the inference task that the model will perform.
  API name: task_type
- timeout
  Specifies the amount of time to wait for the inference endpoint to be created.
  API name: timeout
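For example, a 30-second wait can be expressed with the Time builder and passed to the request builder's timeout setter (a sketch; the duration is illustrative):

    // Sketch: an explicit creation timeout, e.g. inside the of(...) lambda as
    //     .timeout(t -> t.time("30s"))
    // or as a standalone Time value:
    Time timeout = Time.of(t -> t.time("30s"));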
- voyageaiInferenceId
  Required - The unique identifier of the inference endpoint.
  API name: voyageai_inference_id
- serialize
  public void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
  Serialize this object to JSON.
  Specified by: serialize in interface JsonpSerializable
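As a sketch of serializing a built request to a JSON string, assuming the Jackson-backed mapper shipped with the client library:

    import co.elastic.clients.json.JsonpMapper;
    import co.elastic.clients.json.jackson.JacksonJsonpMapper;
    import jakarta.json.stream.JsonGenerator;
    import java.io.StringWriter;

    // Sketch: serialize the request with an explicit generator and mapper.
    JsonpMapper mapper = new JacksonJsonpMapper();
    StringWriter writer = new StringWriter();
    try (JsonGenerator generator = mapper.jsonProvider().createGenerator(writer)) {
        request.serialize(generator, mapper);   // writes the request body as JSON
    }
    String json = writer.toString();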
 
- serializeInternal
  protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
- setupPutVoyageaiRequestDeserializer
  protected static void setupPutVoyageaiRequestDeserializer(ObjectDeserializer<PutVoyageaiRequest.Builder> op)