Class PutElasticsearchRequest
- All Implemented Interfaces:
JsonpSerializable
Create an inference endpoint to perform an inference task with the elasticsearch service.
info Your Elasticsearch deployment contains preconfigured ELSER and E5 inference endpoints; you only need to create endpoints using the API if you want to customize the settings.
If you use the ELSER or the E5 model through the elasticsearch service, the API request will automatically download and deploy the model if it isn't downloaded yet.
info You might see a 502 bad gateway error in the response when using the Kibana Console. This error usually just reflects a timeout while the model downloads in the background. You can check the download progress in the Machine Learning UI. If using the Python client, you can set the timeout parameter to a higher value.
After creating the endpoint, wait for the model deployment to complete before using it. To verify the deployment status, use the get trained model statistics API. Look for "state": "fully_allocated" in the response and ensure that the "allocation_count" matches the "target_allocation_count". Avoid creating multiple endpoints for the same model unless required, as each endpoint consumes significant resources.
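As a sketch, such an endpoint could be built with this request class roughly as follows. The endpoint name, model ID, enum constant names, and the service-settings fields (numAllocations, numThreads, modelId) used here are illustrative assumptions based on the REST API, not details confirmed by this page:

```java
import co.elastic.clients.elasticsearch.inference.ElasticsearchServiceType;
import co.elastic.clients.elasticsearch.inference.ElasticsearchTaskType;
import co.elastic.clients.elasticsearch.inference.PutElasticsearchRequest;

public class PutElasticsearchExample {
    // Sketch only: builds a put_elasticsearch request for a sparse-embedding
    // (ELSER) endpoint. Field and constant names below are assumptions.
    public static PutElasticsearchRequest build() {
        return PutElasticsearchRequest.of(r -> r
            .elasticsearchInferenceId("my-elser-endpoint") // must not match the model_id
            .taskType(ElasticsearchTaskType.SparseEmbedding)
            .service(ElasticsearchServiceType.Elasticsearch)
            .serviceSettings(ss -> ss
                .modelId(".elser_model_2") // assumed preconfigured ELSER model id
                .numAllocations(1)
                .numThreads(1)));
    }
}
```

Sending the request would then presumably go through the inference namespace of an ElasticsearchClient; the exact client method name is not shown on this page.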
- See Also:
-
Nested Class Summary
Nested Classes
Nested classes/interfaces inherited from class co.elastic.clients.elasticsearch._types.RequestBase:
RequestBase.AbstractBuilder<BuilderT extends RequestBase.AbstractBuilder<BuilderT>>
-
Field Summary
Fields
static final JsonpDeserializer<PutElasticsearchRequest> _DESERIALIZER
Json deserializer for PutElasticsearchRequest
static final Endpoint<PutElasticsearchRequest, PutElasticsearchResponse, ErrorResponse> _ENDPOINT
Endpoint "inference.put_elasticsearch".
-
Method Summary
chunkingSettings()
The chunking configuration object.
final String elasticsearchInferenceId()
Required - The unique identifier of the inference endpoint.
static PutElasticsearchRequest of(Function<PutElasticsearchRequest.Builder, ObjectBuilder<PutElasticsearchRequest>> fn)
void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
Serialize this object to JSON.
protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
final ElasticsearchServiceType service()
Required - The type of service supported for the specified task type.
serviceSettings()
Required - Settings used to install the inference model.
protected static void setupPutElasticsearchRequestDeserializer(ObjectDeserializer<PutElasticsearchRequest.Builder> op)
taskSettings()
Settings to configure the inference task.
final ElasticsearchTaskType taskType()
Required - The type of the inference task that the model will perform.

Methods inherited from class co.elastic.clients.elasticsearch._types.RequestBase:
toString
-
Field Details
-
_DESERIALIZER
Json deserializer for PutElasticsearchRequest
-
_ENDPOINT
public static final Endpoint<PutElasticsearchRequest, PutElasticsearchResponse, ErrorResponse> _ENDPOINT
Endpoint "inference.put_elasticsearch".
-
-
Method Details
-
of
public static PutElasticsearchRequest of(Function<PutElasticsearchRequest.Builder, ObjectBuilder<PutElasticsearchRequest>> fn)
-
chunkingSettings
The chunking configuration object.
API name: chunking_settings
-
elasticsearchInferenceId
Required - The unique identifier of the inference endpoint. This must not match the model_id.
API name: elasticsearch_inference_id
-
service
Required - The type of service supported for the specified task type. In this case, elasticsearch.
API name: service
-
serviceSettings
Required - Settings used to install the inference model. These settings are specific to the elasticsearch service.
API name: service_settings
-
taskSettings
Settings to configure the inference task. These settings are specific to the task type you specified.
API name: task_settings
-
taskType
Required - The type of the inference task that the model will perform.
API name: task_type
-
serialize
Serialize this object to JSON.
- Specified by:
serialize in interface JsonpSerializable
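For illustration, serialize can be used to render a request body to a JSON string without sending it, by obtaining a JsonGenerator from a mapper's JSON provider. This is a sketch assuming the Jackson-backed JacksonJsonpMapper is available on the classpath and that the request was built elsewhere:

```java
import java.io.StringWriter;

import co.elastic.clients.elasticsearch.inference.PutElasticsearchRequest;
import co.elastic.clients.json.JsonpMapper;
import co.elastic.clients.json.jackson.JacksonJsonpMapper;
import jakarta.json.stream.JsonGenerator;

public class SerializeSketch {
    // Sketch: serialize a PutElasticsearchRequest body to a JSON string
    // using the serialize(JsonGenerator, JsonpMapper) method documented above.
    public static String toJson(PutElasticsearchRequest request) {
        JsonpMapper mapper = new JacksonJsonpMapper();
        StringWriter writer = new StringWriter();
        JsonGenerator generator = mapper.jsonProvider().createGenerator(writer);
        request.serialize(generator, mapper);
        generator.close();
        return writer.toString();
    }
}
```

This can be handy for logging or debugging the request body that would be sent to the inference.put_elasticsearch endpoint.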
-
serializeInternal
-
setupPutElasticsearchRequestDeserializer
protected static void setupPutElasticsearchRequestDeserializer(ObjectDeserializer<PutElasticsearchRequest.Builder> op)
-