Class PutContextualaiRequest
java.lang.Object
co.elastic.clients.elasticsearch._types.RequestBase
co.elastic.clients.elasticsearch.inference.PutContextualaiRequest
- All Implemented Interfaces:
JsonpSerializable
@JsonpDeserializable
public class PutContextualaiRequest
extends RequestBase
implements JsonpSerializable
Create a Contextual AI inference endpoint.
Create an inference endpoint to perform an inference task with the contextualai service.
To review the available rerank models, refer to https://docs.contextual.ai/api-reference/rerank/rerank#body-model.
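As a rough usage sketch (not part of the generated reference), the endpoint can be created through the client's inference namespace. The putContextualai method name is assumed from the "inference.put_contextualai" endpoint naming convention, and the enum constants and service-settings fields used below (apiKey, modelId) are illustrative assumptions rather than documented parameters:

import co.elastic.clients.elasticsearch.ElasticsearchClient;
import co.elastic.clients.elasticsearch.inference.ContextualAIServiceType;
import co.elastic.clients.elasticsearch.inference.PutContextualaiResponse;
import co.elastic.clients.elasticsearch.inference.TaskTypeContextualAI;
import java.io.IOException;

public class PutContextualaiExample {
    // Sketch only: create a Contextual AI rerank endpoint through an already-configured client.
    static PutContextualaiResponse createEndpoint(ElasticsearchClient client) throws IOException {
        return client.inference().putContextualai(r -> r
            .contextualaiInferenceId("my-contextualai-rerank")   // API name: contextualai_inference_id
            .taskType(TaskTypeContextualAI.Rerank)               // assumed enum constant for the rerank task
            .service(ContextualAIServiceType.Contextualai)       // assumed enum constant
            .serviceSettings(s -> s                              // apiKey and modelId are illustrative names
                .apiKey("<contextual-ai-api-key>")
                .modelId("<rerank-model-id>")));
    }
}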
Nested Class Summary
Nested Classes
Nested classes/interfaces inherited from class co.elastic.clients.elasticsearch._types.RequestBase:
RequestBase.AbstractBuilder<BuilderT extends RequestBase.AbstractBuilder<BuilderT>>
Field Summary
Fields
Modifier and Type | Field | Description
static final JsonpDeserializer<PutContextualaiRequest> | _DESERIALIZER | Json deserializer for PutContextualaiRequest
static final Endpoint<PutContextualaiRequest, PutContextualaiResponse, ErrorResponse> | _ENDPOINT | Endpoint "inference.put_contextualai".
Method Summary
Modifier and Type | Method | Description
 | chunkingSettings() | The chunking configuration object.
final String | contextualaiInferenceId() | Required - The unique identifier of the inference endpoint.
static PutContextualaiRequest | of(Function<PutContextualaiRequest.Builder, ObjectBuilder<PutContextualaiRequest>> fn) |
void | serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper) | Serialize this object to JSON.
protected void | serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper) |
final ContextualAIServiceType | service() | Required - The type of service supported for the specified task type.
 | serviceSettings() | Required - Settings used to install the inference model.
protected static void | setupPutContextualaiRequestDeserializer(ObjectDeserializer<PutContextualaiRequest.Builder> op) |
final ContextualAITaskSettings | taskSettings() | Settings to configure the inference task.
final TaskTypeContextualAI | taskType() | Required - The type of the inference task that the model will perform.
final Time | timeout() | Specifies the amount of time to wait for the inference endpoint to be created.
Methods inherited from class co.elastic.clients.elasticsearch._types.RequestBase:
toString
Field Details
_DESERIALIZER
public static final JsonpDeserializer<PutContextualaiRequest> _DESERIALIZER
Json deserializer for PutContextualaiRequest
_ENDPOINT
public static final Endpoint<PutContextualaiRequest, PutContextualaiResponse, ErrorResponse> _ENDPOINT
Endpoint "inference.put_contextualai".
Method Details
of
public static PutContextualaiRequest of(Function<PutContextualaiRequest.Builder, ObjectBuilder<PutContextualaiRequest>> fn)
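For example, a request can be assembled with the lambda-style builder; this is only a sketch, and the setter values, enum constants, and service-settings field names are illustrative assumptions:

PutContextualaiRequest request = PutContextualaiRequest.of(r -> r
    .contextualaiInferenceId("my-contextualai-rerank")
    .taskType(TaskTypeContextualAI.Rerank)                               // assumed enum constant
    .service(ContextualAIServiceType.Contextualai)                       // assumed enum constant
    .serviceSettings(s -> s.apiKey("<api-key>").modelId("<model-id>"))   // assumed setting names
);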
chunkingSettings
The chunking configuration object.
API name: chunking_settings
contextualaiInferenceId
Required - The unique identifier of the inference endpoint.
API name: contextualai_inference_id
service
Required - The type of service supported for the specified task type. In this case, contextualai.
API name: service
serviceSettings
Required - Settings used to install the inference model. These settings are specific to the contextualai service.
API name: service_settings
taskSettings
Settings to configure the inference task. These settings are specific to the task type you specified.
API name: task_settings
taskType
Required - The type of the inference task that the model will perform.
API name: task_type
timeout
Specifies the amount of time to wait for the inference endpoint to be created.
API name: timeout
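A small sketch, assuming the builder accepts the client's usual Time value (the 30-second wait is illustrative):

import co.elastic.clients.elasticsearch._types.Time;

// Illustrative: wait up to 30 seconds for the inference endpoint to be created.
PutContextualaiRequest.Builder builder = new PutContextualaiRequest.Builder();
builder.timeout(Time.of(t -> t.time("30s")));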
serialize
Serialize this object to JSON.
Specified by:
serialize in interface JsonpSerializable
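A minimal serialization sketch, assuming a JacksonJsonpMapper and a jakarta.json generator; the variable request is a PutContextualaiRequest built elsewhere:

import co.elastic.clients.json.jackson.JacksonJsonpMapper;
import jakarta.json.Json;
import jakarta.json.stream.JsonGenerator;
import java.io.StringWriter;

// Illustrative only: write the request out as a JSON string.
JacksonJsonpMapper mapper = new JacksonJsonpMapper();
StringWriter writer = new StringWriter();
try (JsonGenerator generator = Json.createGenerator(writer)) {
    request.serialize(generator, mapper);
}
String json = writer.toString();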
serializeInternal
protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
setupPutContextualaiRequestDeserializer
protected static void setupPutContextualaiRequestDeserializer(ObjectDeserializer<PutContextualaiRequest.Builder> op)