Class PutAzureaistudioRequest
java.lang.Object
co.elastic.clients.elasticsearch._types.RequestBase
co.elastic.clients.elasticsearch.inference.PutAzureaistudioRequest
- All Implemented Interfaces:
JsonpSerializable
@JsonpDeserializable
public class PutAzureaistudioRequest
extends RequestBase
implements JsonpSerializable
Create an Azure AI studio inference endpoint.
Create an inference endpoint to perform an inference task with the
azureaistudio service.
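As a usage sketch (not taken from this page): a request is typically created with the client's fluent builder and executed through the inference namespace. The service-settings setters shown (apiKey, target, provider, endpointType) and the enum constant names are assumptions based on the Azure AI Studio inference service, so verify them against your client version.

```
// Sketch only: assumes an already-configured ElasticsearchClient named
// "client", and that the builder setters below exist in your client version.
PutAzureaistudioResponse resp = client.inference().putAzureaistudio(r -> r
    .azureaistudioInferenceId("my-azureaistudio-endpoint") // azureaistudio_inference_id
    .taskType(AzureAiStudioTaskType.Completion)            // task_type
    .service(AzureAiStudioServiceType.Azureaistudio)       // service
    .serviceSettings(s -> s
        .apiKey("<azure-api-key>")     // assumed field names; check the
        .target("<azure-target-uri>")  // Azure AI Studio service docs for
        .provider("openai")            // the exact keys your deployment
        .endpointType("token")         // requires
    )
    .timeout(t -> t.time("30s"))
);
```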
Nested Class Summary
Nested Classes:
static class PutAzureaistudioRequest.Builder
Nested classes/interfaces inherited from class co.elastic.clients.elasticsearch._types.RequestBase:
RequestBase.AbstractBuilder<BuilderT extends RequestBase.AbstractBuilder<BuilderT>>
-
Field Summary
Fields:
static final JsonpDeserializer<PutAzureaistudioRequest> _DESERIALIZER
    Json deserializer for PutAzureaistudioRequest
static final Endpoint<PutAzureaistudioRequest, PutAzureaistudioResponse, ErrorResponse> _ENDPOINT
    Endpoint "inference.put_azureaistudio".
-
Method Summary
final String azureaistudioInferenceId()
    Required - The unique identifier of the inference endpoint.
chunkingSettings()
    The chunking configuration object.
static PutAzureaistudioRequest of(Function<PutAzureaistudioRequest.Builder, ObjectBuilder<PutAzureaistudioRequest>> fn)
void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
    Serialize this object to JSON.
protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
final AzureAiStudioServiceType service()
    Required - The type of service supported for the specified task type.
serviceSettings()
    Required - Settings used to install the inference model.
protected static void setupPutAzureaistudioRequestDeserializer(ObjectDeserializer<PutAzureaistudioRequest.Builder> op)
taskSettings()
    Settings to configure the inference task.
final AzureAiStudioTaskType taskType()
    Required - The type of the inference task that the model will perform.
final Time timeout()
    Specifies the amount of time to wait for the inference endpoint to be created.
Methods inherited from class co.elastic.clients.elasticsearch._types.RequestBase:
toString
-
Field Details
-
_DESERIALIZER
public static final JsonpDeserializer<PutAzureaistudioRequest> _DESERIALIZER
Json deserializer for PutAzureaistudioRequest -
_ENDPOINT
public static final Endpoint<PutAzureaistudioRequest, PutAzureaistudioResponse, ErrorResponse> _ENDPOINT
Endpoint "inference.put_azureaistudio".
-
-
Method Details
-
of
public static PutAzureaistudioRequest of(Function<PutAzureaistudioRequest.Builder, ObjectBuilder<PutAzureaistudioRequest>> fn) -
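The of(...) factory follows the client's general lambda-builder convention: the caller is handed a fresh Builder, configures it, and of() calls build() on the result. A minimal, stdlib-only sketch of the same pattern, using hypothetical Request/Builder stand-ins (not the real client classes):

```java
import java.util.function.Function;

// Stand-in for co.elastic.clients.util.ObjectBuilder<T>.
interface ObjectBuilder<T> {
    T build();
}

// Hypothetical request type mirroring the PutAzureaistudioRequest.of(...) shape.
class Request {
    final String inferenceId;

    private Request(String inferenceId) {
        this.inferenceId = inferenceId;
    }

    static class Builder implements ObjectBuilder<Request> {
        private String inferenceId;

        Builder inferenceId(String v) {
            this.inferenceId = v;
            return this;
        }

        @Override
        public Request build() {
            return new Request(inferenceId);
        }
    }

    // Same signature shape as PutAzureaistudioRequest.of(...): hand the lambda
    // a fresh Builder, let it configure the object, then build the result.
    static Request of(Function<Builder, ObjectBuilder<Request>> fn) {
        return fn.apply(new Builder()).build();
    }
}

public class OfPatternDemo {
    public static void main(String[] args) {
        Request r = Request.of(b -> b.inferenceId("my-endpoint"));
        System.out.println(r.inferenceId); // prints "my-endpoint"
    }
}
```

This is why of() takes a Function returning ObjectBuilder rather than the built object: the lambda can return the Builder itself, and the factory finishes construction.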
azureaistudioInferenceId
Required - The unique identifier of the inference endpoint.
API name: azureaistudio_inference_id
-
chunkingSettings
The chunking configuration object.
API name: chunking_settings
-
service
Required - The type of service supported for the specified task type. In this case, azureaistudio.
API name: service
-
serviceSettings
Required - Settings used to install the inference model. These settings are specific to the azureaistudio service.
API name: service_settings
-
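For reference, this request class corresponds to the REST call PUT _inference/{task_type}/{azureaistudio_inference_id}. A hedged example of the request body it serializes to (the service_settings keys shown are assumptions based on the Azure AI Studio service documentation, not listed on this page):

```
PUT _inference/completion/my-azureaistudio-endpoint
{
  "service": "azureaistudio",
  "service_settings": {
    "api_key": "<azure-api-key>",
    "target": "<azure-target-uri>",
    "provider": "openai",
    "endpoint_type": "token"
  }
}
```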
taskSettings
Settings to configure the inference task. These settings are specific to the task type you specified.
API name: task_settings
-
taskType
Required - The type of the inference task that the model will perform.
API name: task_type
-
timeout
Specifies the amount of time to wait for the inference endpoint to be created.
API name: timeout
-
serialize
Serialize this object to JSON.
- Specified by:
serialize in interface JsonpSerializable
-
serializeInternal
protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
-
setupPutAzureaistudioRequestDeserializer
protected static void setupPutAzureaistudioRequestDeserializer(ObjectDeserializer<PutAzureaistudioRequest.Builder> op)
-